A rant: Will the agent of change for SW stand up? July 29, 2009
Posted by systemleveldesign in Uncategorized.
How many times have I heard it by now: software is the growing problem, software is the savior for the traditional EDA companies, the software teams are outgrowing the hardware teams, yet the software tools budgets are not…
This week’s Design Automation Conference is no different; you can hear it from the CEOs of the EDA companies down to their soldiers, from the executives of semiconductor companies down to theirs, from the analysts to the press, even from the taxi drivers in San Francisco. Software is the problem; we need to do something about it!
Yet when it comes to actually doing something, very few of these players are effectively executing. Everything is still very much about RTL, optimization of the hardware, and so on. True software solutions are few. The first thing we hear from customers is “yes, it is about the software, but what about my hardware problem?” The second is from the companies (trying to protect their legacy): “yes, software is the problem, but let me explain how what I have today solves it.”
Well, let’s be realistic: if what exists today from these companies, and what is being used by engineers, were solving the problem, then there would be no problem!
Addressing the software problem requires the willingness to change: to change the development process, to retool, to invest in new technologies, to look at product life cycle management, to stand up and say “Yes, there is a software problem and I will not wait for the storm to pass before I fix it,” “Yes, there is a software problem and I will lead my organization through this difficult but necessary change.”
Are you the engineer, the manager, the executive who will say yes and drive the change? If not, get out of the way; the wave is coming anyway.
NASCUG Does It Again at DAC July 28, 2009
Posted by systemleveldesign in Uncategorized. Tags: architecture design, electronic, system, SystemC, TLM, virtual platform, virtualization
The North America SystemC User Group (NASCUG) once again held its popular SystemC user group meeting at DAC. Attendance was good: over 100 people came to hear the OSCI update, education on SystemC standards, and technical papers from users on the application of SystemC, and to participate in an open Town Hall–style discussion.
Interestingly, many of the attendees were at NASCUG for the first time: according to the innovative live “clicker” survey system, 43% were first-time attendees. That said, the majority of attendees considered themselves intermediate, advanced, or power users of SystemC. Before diving into the papers, OSCI announced progress in the TLM, AMS, and Synthesis working groups (read more at www.systemc.org) and gave updates on all WG activities, including the new Configuration, Control, and Inspection (CCI) WG.
Technical presentations covered modeling, software development, architecture design, and verification. For the complete agenda and access to the presentations, visit www.nascug.org.
Proven! ST-Ericsson Virtual Platform usage presentation at DAC 2009 July 28, 2009
Posted by marcseru in Uncategorized. Tags: ARM, cortex, coware, Electronic System Virtualization, ESV, FPGA, virtual platform
I listened today to the presentation from Olivier Mielo, a software engineer from ST-Ericsson. It was a very exciting presentation on why virtual platforms will be the way forward to address the current challenges of software development.
In his presentation Olivier talked about the use of a SystemC virtual platform to develop and validate a UMTS layer1 protocol stack. Here are some of the great points that he made.
He first explained the design challenges, which included:
- The need for a “software-centric” approach to address multiple use cases
- Developing and validating Layer 1 (and Layer 2/3) software covering the HSXPA protocol while keeping the existing Layer 1 UMTS R.99 software up and running on the new architecture
- Reducing the time to reach the “final” solution, getting to test approval against a base station as soon as possible, and being on time for mass production with customers
He then explained why traditional methods such as FPGA prototypes or standalone proprietary simulation would not work. Here are his points on FPGA prototyping:
- A new netlist is needed each time the architecture changes
- It is not easy to spread a complex design over many FPGAs
- Only partial visibility into the system is available, through JTAG
- Multi-core software is difficult to debug without breaking real-time conditions
They decided to deploy virtual platforms using CoWare technologies and expertise. After explaining their deployment and use, the following conclusions were reached:
- The virtual platform was available to start software development within a month
- Software development was shortened and completed before the RTL
- Early system integration and validation were done (they had a full virtual platform test bench with the base station tester from ANITE integrated)
- The powerful debug support made their work much easier and reduced project risks
You can listen to his presentation by visiting the CoWare booth #3665 at DAC in San Francisco. There will be replays during the rest of the week.
Big Crowd for CoWare ESV DAC Exhibitor Forum Session July 28, 2009
Posted by systemleveldesign in LTE, system design. Tags: DAC 09, Design Automation Conference, Electronic System Virtualization, ESV
Presenting in the first exhibitor forum session at DAC, during difficult economic times, is always somewhat of a risk. It wasn’t clear how big an audience we should expect. Standing in front of the exhibitor forum seating area, I was waiting with my co-presenters without high hopes. The first five people took their seats 20 minutes before the presentation, and eventually attendees filled the seats… I shouldn’t have worried. Over 100 people attended our presentation on Addressing the Design Challenges of ARM-Based LTE Mobile Phone Designs Using System Virtualization. Together with IP model partners ARM, Ltd. and Carbon Design Systems, CoWare explained how CoWare Electronic System Virtualization solutions address the design challenges facing hardware and software engineers working on LTE mobile phone designs. The questions from the audience made it clear that standards like SystemC are very important and that the complexity of their designs demands a fast and flexible solution for multi-core designs.
These are exactly some of the key strengths of CoWare Electronic System Virtualization solutions. As recent customer success stories prove, not only has CoWare consistently led the industry with standards-based ESV solutions, but customers are also choosing CoWare ESV solutions for their complex multi-core designs. Since this type of design is becoming the de facto standard for today’s electronic devices, virtualization needs will only grow.
Looking back at the audience for the electronic system virtualization exhibitor forum session, interest in system virtualization is clearly growing, and people are looking more than ever to CoWare to help pull in design time, lower costs, and enable the supply chain. It promises to be an interesting DAC.
Show me the customer experience! July 25, 2009
Posted by marcseru in Uncategorized.
I always find the slew of product announcements around industry tradeshows interesting. DAC is no different: every supplier comes out with the latest and greatest thing, hoping to make a splash and win with its technology. Over the years, the target audience has probably figured out that there is not much to it beyond planning a route to see demos while being showered with the marketing superlatives used to describe the products.
This year CoWare decided to take another approach by bringing the audience the Electronic System Virtualization customer experience. First, through announcements: last week we announced that NXP successfully deployed an integrated CoWare and ARM Cortex processor-based solution in its virtual prototyping environment, and there is more to come. Second, and probably more important, through actual customer presentations at our DAC booth.
The presentations will include:
- Developing Advanced Wireless Technologies Using CoWare, by Interdigital
- Using a Virtual Platform to Develop and Validate a UMTS Layer1 Protocol Software Stack, by ST-Ericsson
- Achieving Optimal Cost-performance Balance in Advanced Wireless Modem Chipsets using Stochastic Simulation, by Motorola
Information and registration for these presentations can be found at http://www.coware.com/news/dac09.php
Plan to attend!
What is “Electronic System Virtualization”? July 25, 2009
Posted by marcseru in Uncategorized.
For people interested in electronic system virtualization, the term virtualization itself may be confusing. There are indeed many different uses of the term, the best-known being, of course, the one made famous by the IT community and companies such as VMWare. To clarify some of the discussion on this blog, let me define what we mean by Electronic System Virtualization.
We have to consider two elements, “Electronic System” and “Virtualization.” “Electronic System” refers to electronic products ranging from processor cores and chips to hardware boards, devices (for example a mobile phone, a printer, a router, a medical device, …) and even networks of devices (for example a car with a network of ECUs). “Electronic System Virtualization” is therefore the emulation/simulation of such electronic systems.

It should be noted that how the electronic system is virtualized depends on the use one has for the virtualization technology. For example, virtualization enabling interconnect and memory performance optimization would represent the interconnect and memory subsystem of the hardware with a high level of accuracy, while traffic generators serve as the electronic system test environment. Virtualization enabling software development, on the other hand, would present only the view of the hardware that is relevant from a programmer’s perspective. There are many use cases for Electronic System Virtualization, and each may have specific requirements regarding how the system needs to be virtualized.
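The first use case, interconnect and memory performance optimization, can be sketched with a toy model: a memory subsystem that charges cycle costs, driven by a simple traffic generator standing in for the rest of the system. Everything here (the cache geometry, the latencies, the class and function names) is hypothetical and chosen only for illustration; a real ESV flow would use SystemC/TLM models rather than this plain-Python stand-in.

```python
class MemoryModel:
    """Toy memory subsystem: a direct-mapped cache over 64-byte lines
    that charges different cycle costs for hits and misses."""

    def __init__(self, n_sets=16, hit_latency=2, miss_latency=20):
        self.tags = [None] * n_sets
        self.hit_latency = hit_latency
        self.miss_latency = miss_latency
        self.cycles = 0

    def access(self, addr):
        line = addr // 64             # which 64-byte line this address falls in
        idx = line % len(self.tags)   # direct-mapped set index
        if self.tags[idx] == line:
            self.cycles += self.hit_latency
        else:
            self.tags[idx] = line     # fill the line on a miss
            self.cycles += self.miss_latency


def sequential_traffic(mem, n_words):
    """Traffic generator: a sequential sweep of 4-byte word reads."""
    for i in range(n_words):
        mem.access(i * 4)
    return mem.cycles


mem = MemoryModel()
cycles = sequential_traffic(mem, 1024)   # 4 KiB sequential sweep
# 1024 words touch 64 lines: 64 misses and 960 hits,
# so 64*20 + 960*2 = 3200 cycles.
print(cycles)  # 3200
```

Swapping the traffic generator for a strided or random pattern, or changing the latency parameters, is exactly the kind of what-if exploration such a performance view enables, long before the real interconnect exists.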
So why should you care about electronic system virtualization? The complexity of electronic systems is increasing, and the costs of defining, developing, and deploying such systems have increased significantly. Companies need to get the system right the first time while reducing costs and enabling their go-to-market. Electronic System Virtualization provides the infrastructure (tools and methodologies) to address these challenges.
Do you have experience with Electronic System Virtualization? Let us know what you think.
“Systems” is an overloaded term July 25, 2009
Posted by marcseru in Uncategorized.
When someone talks to you about “systems” what image or word comes to your mind? Well, this is obviously a loaded question and at the end of the day the answer depends on who you are.
Having dealt with people selling hardware development tools to semiconductor companies, I would say that most of the time the first words out of their mouths would be “System-on-a-Chip,” and there is nothing wrong with that. Having also dealt with people selling software development tools, their response would probably be more along the lines of the full hardware board or the electronic product being designed. Finally, having also dealt with large electronic product companies, I know they would definitely argue that the system is much more than all of the above: an automotive OEM will most likely tell you that the car is the system, while a networking company will argue that the system is the network and the multitude of components making it up.
So where am I going with this? When we use the word “systems,” it is important for the electronic product industry and its ecosystem to keep an open mind. In particular, when it comes to system-level design, we always need to understand the perspective of the audience. System-level design technology such as virtual platforms should not be considered a hardware development tool, a firmware development tool, a verification environment, or a network simulator. True system-level technology will scale beyond a specific engineering task: it will be an infrastructure throughout the electronic product’s life cycle and across the electronic product supply chain.
So what comes to your mind when you talk about “Systems”?
Multicore Expo highlights continued progress. March 23, 2009
Posted by marcseru in Uncategorized.
Multicore Expo took place last week. If you are not familiar with the event, go and visit http://www.multicore-expo.com/. It is a great way to get input on what is going on with multicore trends and technologies.
This last event had a slew of presentations covering multicore and provided some interesting views. There is clearly still a long way to go, but from year to year we can see that thoughts and directions are forming and becoming clearer.
On the Expo’s site you can find the presentations from last year, and I am sure we will soon have access to the 2009 presentations. Feel free to drop a line; I am sure there will be a lot of good discussion on this topic this year.
Long Term Evolution Wireless – Get Ready for 1Gbit/s! February 6, 2009
Posted by jstahl09 in Uncategorized.
While some of you in the US may be scrambling to get your old Mickey Mouse antenna replaced for the switchover to HD digital TV, a group of companies owning a huge amount of spectrum is preparing to get you going with HD content on your mobile devices. The wireless operators around the world, the AT&Ts, Vodafones, and Docomos, are putting the final touches on a new standard called LTE (Long Term Evolution), which will provide 100 Mbit/s peak performance, giving access to the internet at warp speed even while travelling at 200 miles per hour on the Shinkansen train. To meet that goal, the base station providers and cell phone manufacturers have to reinvent their platforms; simple scaling of the previous architectures does not work.
Where does this new standard put the biggest pressure? On the semiconductor companies in the wireless space. They are redoing their architectures to deliver the scalability and the much higher speeds compared to the previous generation (HSPA), which is deployed in the market today. Even if you are not into wireless design, a look at the required 5-10x performance delta gives you an idea of the design challenge: a peak data rate of 100 Mbps vs. 14 Mbps today, a latency of 5 ms vs. 50 ms today, and an 8x broadcast data rate improvement.
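A quick back-of-the-envelope calculation makes the quoted figures concrete (the helper function here is hypothetical, just a way to express the two ratios):

```python
def generation_gains(new_peak_mbps, old_peak_mbps, new_latency_ms, old_latency_ms):
    """Return (throughput gain, latency improvement) between two
    wireless generations: higher peak rate is better, lower latency is better."""
    return new_peak_mbps / old_peak_mbps, old_latency_ms / new_latency_ms

# LTE vs. HSPA figures quoted above: 100 vs. 14 Mbps peak, 5 vs. 50 ms latency
throughput_gain, latency_gain = generation_gains(100.0, 14.0, 5.0, 50.0)
print(round(throughput_gain, 1), latency_gain)  # 7.1 10.0
```

Both ratios land squarely in the 5-10x envelope, and the 8x broadcast improvement sits in the same band: every axis of the design must improve by roughly an order of magnitude at once.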
What does it mean for the design? Most design teams would be tempted to design a lot more optimized signal processing hardware, but by the time they were done with that, the standard would be moving toward LTE-Advanced (1 Gbit/s peak rate). Also, the operational flexibility of LTE does not lend itself to a fixed architecture. So what should you expect to happen in these leading-edge design teams? They will use processors wherever possible: standard cores, customizable cores, and dedicated cores will be used to differentiate their architectures in terms of power and performance. This is multicore design at its finest, and of course it comes with a huge amount of design challenges for the performance optimization of the architecture and the software.
Is this all worth it for the semiconductor industry? Well, the operators certainly believe so. According to ABI Research’s senior analyst Nadine Manjaro, “ABI Research believes that NTT will also deploy LTE in Japan in 2009. We forecast that by 2013 operators will spend over $8.6 billion on LTE base station infrastructure alone. For operators that have already deployed 3G networks, LTE will be a key CAPEX driver over the next five years.”
Interested in the most challenging designs around the world? Then quickly learn about LTE!