
A rant: Will the agent of change for SW stand up? July 29, 2009

Posted by systemleveldesign in Uncategorized.

How many times have I heard now that software is the growing problem, that software is the savior for the traditional EDA companies, that the software teams are outgrowing the hardware teams, yet the software tools budgets are not, …

This week's Design Automation Conference is no different; you can hear it from the CEOs of the EDA companies to their soldiers, from the executives of semiconductor companies to their soldiers, from the analysts to the press, to even the taxi drivers in San Francisco. Software is the problem; we need to do something about it!

And yet when it comes to doing something, very few of these players are effectively executing on it. Everything is very much about RTL, optimization of the hardware, etc. The true software solutions are few. The first thing we hear from customers is “yes it is about the software, but what about my hardware problem”. The second is from the companies (trying to protect their legacy): “yes, software is the problem, but let me explain to you how what I have today solves the problem”.

Well let’s be realistic, if what exists today from these companies and what is being used by the engineers was solving the problem, then there would be no problem!

Addressing the software problem requires the willingness to change. To change the development process, to retool, to be willing to say I want to invest in these new technologies, to be willing to look at the product life cycle management, to stand up and say “Yes there is a software problem and I will not wait for the storm to pass before I fix it”, “Yes there is a software problem and I will lead my organization to make this difficult but necessary change”.

Are you this engineer, this manager, this executive that will say yes and drive the change? If not, get out of the way; the wave is coming anyway.

NASCUG Does It Again at DAC July 28, 2009

Posted by systemleveldesign in Uncategorized.

The North America SystemC User Group (NASCUG) once again held its popular SystemC user group meeting at DAC. Attendance was good, with over 100 people attending to hear the OSCI update, education on SystemC standards, and technical papers from users on the application of SystemC, and to participate in an open Town Hall-style discussion.

Interestingly, many of the attendees were at NASCUG for the first time. According to the innovative live survey “clicker” response system, 43% were first-time attendees. That said, the majority of attendees considered themselves intermediate, advanced, or power users of SystemC. Before diving into the papers, OSCI announced progress in the TLM, AMS, and Synthesis working groups (read more at www.systemc.org) and gave updates on all WG activities, including the new Configuration, Control, and Inspection (CCI) WG.

Technical presentations covered modeling, software development, architecture design, and verification.  For the complete agenda and access to the presentations, visit www.nascug.org.

Proven! ST-Ericsson Virtual Platform usage presentation at DAC 2009 July 28, 2009

Posted by marcseru in Uncategorized.

I listened today to the presentation from Olivier Mielo, a software engineer from ST-Ericsson. It was a very exciting presentation on why virtual platforms are the way forward to address the current challenges of software development.

In his presentation Olivier talked about the use of a SystemC virtual platform to develop and validate a UMTS layer1 protocol stack. Here are some of the great points that he made.

He first explained the design challenges which included:

  • Needing a “software centric” approach to address multiple use cases
  • Developing and validating layer 1 (and layer 2/3) software covering the HSXPA protocol, while keeping the existing layer 1 UMTS R.99 software up and running on the new architecture
  • Reducing the time to reach the “final” solution, facing test approval in front of a base station as soon as possible, and being on time for mass production with customers

He then explained why traditional methods such as FPGA prototypes or stand-alone proprietary simulation would not work. Here are his points on FPGA:

  • Need a new netlist each time the architecture changes
  • Not so easy to spread a complex design over many FPGAs
  • Allows only partial visibility into the system, through JTAG
  • Difficult to debug multi-core software without breaking real-time conditions

They decided to deploy virtual platforms using CoWare technologies and expertise. After explaining their deployment and use, the following conclusions were reached:

  • The virtual platform was available to start software development within a month
  • Software development was shortened and completed before RTL
  • Early system integration and validation was done (they had a full virtual platform test bench with the base station tester from ANITE integrated)
  • The powerful debug support made their work much easier and reduced the project risks

You can listen to his presentation by visiting the CoWare booth #3665 at DAC in San Francisco. There will be replays during the rest of the week.

Big Crowd for CoWare ESV DAC Exhibitor Forum Session July 28, 2009

Posted by systemleveldesign in LTE, system design.

Presenting at the first exhibitor forum session at DAC, during some difficult economic times, is always somewhat of a risk. It wasn’t clear how big an audience we should expect. Standing in front of the exhibitor forum seating area, I was waiting with my co-presenters without high hopes. The first 5 people started to get seated 20 minutes before the presentation. And eventually attendees filled the seats … I shouldn’t have worried. Over 100 people attended our presentation on Addressing the Design Challenges of ARM-Based LTE Mobile Phone Designs Using System Virtualization. Together with IP model partners ARM, Ltd. and Carbon Design Systems, CoWare explained how CoWare Electronic System Virtualization solutions address the design challenges that hardware and software engineers working on LTE mobile phone designs are facing. Based on the questions from the audience, it is clear that standards like SystemC are very important and that the complexity of their designs demands a fast and flexible solution for multi-core designs.

These items are exactly some of the key strengths of CoWare Electronic System Virtualization solutions. As recent customer success stories prove, not only has CoWare been consistently leading the industry with standards-based ESV solutions, customers are choosing CoWare ESV solutions for their complex multi-core designs. Since this type of design is becoming the de facto standard for today’s electronic devices, virtualization needs will only grow.

Looking back at the audience for the electronic system virtualization exhibitor forum session, interest in system virtualization is already growing and people are more than ever looking at CoWare to help pull in design time, lower costs and enable the supply chain. It is promising to be an interesting DAC.

Show me the customer experience! July 25, 2009

Posted by marcseru in Uncategorized.

I always find interesting the slew of product announcements around industry tradeshows. DAC is no different: every supplier comes out with the latest and greatest thing, hoping to make a splash and win with its technology. Over the years, the targeted audience has probably figured out that there was not much to it, other than setting up their route to see demos and being littered with the marketing superlatives used to describe the products.

For this year, CoWare decided to take another approach by bringing the Electronic System Virtualization customer experience to the audience. First, through announcements: last week we announced that NXP Successfully Deploys Integrated CoWare and ARM Cortex Processor-based Solution in its Virtual Prototyping Environment, and there is more to come. Second, and probably more important, through actual customer presentations at our DAC booth.

The presentations will include:

  • Developing Advanced Wireless Technologies Using CoWare by Interdigital
  • Using a Virtual Platform to Develop and Validate a UMTS Layer1 Protocol
    Software Stack by ST-Ericsson
  • Achieving Optimal Cost-performance Balance in Advanced Wireless Modem
    Chipsets using Stochastic Simulation by Motorola

Information and registration for these presentations can be found at http://www.coware.com/news/dac09.php

Plan to attend!

What is “Electronic System Virtualization”? July 25, 2009

Posted by marcseru in Uncategorized.

For people interested in electronic system virtualization, the term virtualization itself may be confusing. There are indeed many different uses of the term, the most well-known being of course the one made famous by the IT community and companies such as VMWare. So as to clarify some of the discussion on this blog, let me define what we mean by Electronic System Virtualization.

We have to consider two elements, “Electronic System” and “Virtualization”. “Electronic System” refers to electronic products ranging from processor cores and chips to hardware boards, devices (for example a mobile phone, a printer, a router, a medical device, …) and even networks of devices (for example a car with a network of ECUs). As a result, “Electronic System Virtualization” is the emulation/simulation of such electronic systems. It should be noted that how the electronic system is virtualized depends on the use one has for the virtualization technology. For example, one could imagine a virtualization enabling interconnect and memory performance optimization, where the interconnect and memory subsystem of the hardware are represented with a high level of accuracy while traffic generators are used as the electronic system test environment. Another one could imagine a virtualization enabling the development of software, which would define the virtualization as a view of the hardware relevant from a programmer’s perspective only. There are many use cases for Electronic System Virtualization, and each use case may have specific requirements regarding how the system needs to be virtualized.
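As a toy illustration of the “programmer’s perspective” use case, here is a minimal Python sketch of what a software-oriented virtual model looks like: the software sees registers and behavior, not gate-level detail. This is not CoWare’s (or any real product’s) API; the timer peripheral and its register map are invented for this example.

```python
class TimerModel:
    """Hypothetical memory-mapped countdown timer, modeled at the level
    a programmer cares about: registers and their behavior."""
    CTRL, LOAD, VALUE = 0x00, 0x04, 0x08  # register offsets (invented)

    def __init__(self):
        self.regs = {self.CTRL: 0, self.LOAD: 0, self.VALUE: 0}

    def write(self, offset, data):
        self.regs[offset] = data & 0xFFFFFFFF
        if offset == self.CTRL and data & 0x1:   # ENABLE bit loads the counter
            self.regs[self.VALUE] = self.regs[self.LOAD]

    def read(self, offset):
        return self.regs[offset]

    def tick(self):
        """Advance simulated time one cycle; count down while enabled."""
        if self.regs[self.CTRL] & 0x1 and self.regs[self.VALUE] > 0:
            self.regs[self.VALUE] -= 1

# "Firmware" written against the register map, unaware it runs on a model:
timer = TimerModel()
timer.write(TimerModel.LOAD, 3)
timer.write(TimerModel.CTRL, 1)      # enable the timer
for _ in range(3):
    timer.tick()
print(timer.read(TimerModel.VALUE))  # counted down from 3 to 0
```

The point of the sketch is the abstraction level: the same firmware could later run unchanged against RTL or silicon, which is exactly what makes such a view useful for early software development.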

So why should you care about electronic system virtualization? Electronic system complexity is increasing. The costs of defining, developing and deploying such systems have increased significantly. Companies need to get the system right the first time while reducing costs and enabling their go-to-market. “Electronic System Virtualization” provides the infrastructure (tools and methodologies) to address these challenges.

Do you have experience with “Electronic System Virtualization”? Let us know what you think.

“Systems” is an overloaded term July 25, 2009

Posted by marcseru in Uncategorized.

When someone talks to you about “systems” what image or word comes to your mind? Well, this is obviously a loaded question and at the end of the day the answer depends on who you are.

Having dealt with people selling hardware development tools to semiconductor companies, I would say that most of the time the first words out of their mouths would be “System-on-a-Chip”. And there is nothing wrong with that. Having also dealt with people selling software development tools, their response would probably be more along the lines of the full hardware board or the electronic product being designed. Finally, having also dealt with large electronic product companies, they would definitely argue that the system is much more than all of the above. An automotive OEM will most likely tell you that the car is the system; a networking company will argue that the system is the network and the multitude of components making it up.

So where am I going with this … When we use the word “systems”, it is important for the electronic product industry and its ecosystem to keep an open mind. In particular, when it comes to system-level design, we always need to understand the perspective of the audience. System-level design technology such as virtual platforms should not be considered merely a hardware development tool, a firmware development tool, a verification environment or a network simulator. True system-level technology will scale beyond a specific engineering task. It will be an infrastructure throughout the electronic product’s life cycle and across the electronic product supply chain.

So what comes to your mind when you talk about “Systems”?

2009 a Year of Changes: Wind River & Intel June 8, 2009

Posted by marcseru in Embedded, software.

In January, I posted a short entry titled “Welcome 2009: A Year of Change”. Well, last week clearly showed us some changes in the embedded world when Intel announced that it would acquire Wind River. This was major news for the embedded software market, which had been asleep at best over the past year.

The questions on everybody’s mind are now: Why did Intel do it? Who will benefit? How will this change the semiconductor landscape and the embedded software landscape? What about ARM vs. Intel? And so on. Well, I will not try to answer these questions; that would most likely be a presumptuous task. All the companies affected directly or indirectly have a lot of brain power and have obviously thought about their strategy, how this announcement will impact them, and how that will be reflected in the future. Several companies are also probably adapting their strategies to respond. Some will see it as an opportunity, some will see it as a threat, some will take a technology view, some will take a financial view, and some will not care (although I doubt that last one).

So rather than give an extended opinion that may be short-sighted or incorrect, given that I am not involved in these companies’ strategies directly, I thought I would point to some articles and blogs on the topic and let everyone forge their own opinion.

Here is an Information Week posting, asking the question: will Intel be successful, and is Wind River the right company to help?

In an EE Times blog entry, the question is raised about the why of this acquisition, with parallels to other semiconductor companies acquiring embedded software companies.

Other entries include an eWeek posting, an Internet News posting, a Razor Edge blog post, …

There are obviously several blogs and articles that were published last week on the topic, and a Yahoo, Bing or Google search will lead you to them … The conclusion, most of the time, is that the future will tell what happens and whether this was the right acquisition. The fact is that it will change the way the embedded market operates. Let me add some questions to the huge list of existing ones:

  • Should Wind River’s traditional competitors (MontaVista, Green Hills, …) worry, or seize an opportunity, with this announcement?
  • How will semiconductor competitors of Intel adapt? ARM?
  • How does it affect the Microsoft/Intel relationship?
  • Apple acquired PA Semi, will Microsoft also move in this direction? Will Intel provide a more complete Chipset/software stack solution?
  • Is this the start of an acquisition and merger wave including semiconductor, IP, embedded software and even electronics companies?
  • Is this an indication of the operating systems being commoditized into the chip itself?

There are more questions than real concrete answers right now. I would be happy to add more questions to the list, so feel free to comment and provide your feedback.

To conclude, the financial world also does not believe this is necessarily a done deal; see the post “Wind River: A Hope for More Money”. … Welcome to 2009, a year of changes.

Multicore Expo highlights continued progress. March 23, 2009

Posted by marcseru in Uncategorized.

Multicore Expo took place last week. If you are not familiar with the event, go and visit http://www.multicore-expo.com/. It is a great way to get some input on what is going on with multicore trends and technologies.

This last event had a slew of presentations that covered multicore and provided some nice and interesting views. There is clearly still a long way to go, but from year to year we can see that thoughts and directions are forming and becoming clearer.

On the site of the Expo you can find the presentations from last year. I am sure we will soon have access to the 2009 presentations. Feel free to drop a line, I am sure there will be a lot of good discussion on this topic this year.

Long Term Evolution Wireless – Get Ready for 1Gbit/s! February 6, 2009

Posted by jstahl09 in Uncategorized.

While some of you in the US may be scrambling to get your old Mickey Mouse antenna replaced for the switchover to HD digital TV, a group of companies owning a huge amount of spectrum is preparing to get you going with HD content on your mobile devices. The wireless operators around the world, the AT&Ts, Vodafones and Docomos, are putting the final touches on a new standard called LTE (Long Term Evolution), which will provide 100 Mbit/s peak performance, giving access to the internet at warp speed even while travelling at 200 miles/hour on the Shinkansen train. Along with that goal for the standard, the base station providers and cell phone manufacturers have to reinvent their platforms. Simple scaling of the previous architectures does not work.
 
Where does this new standard put the biggest pressure? On the semiconductor companies in the wireless space. They are redoing their architectures to deliver the scalability and much higher speeds compared to the previous generation (HSPA), which is deployed in the market today. Even if you are not into wireless design, taking a look at the required 5-10x performance delta gives you an idea of the design challenge (peak data rate of 100 Mbps vs. 14 Mbps today, latency of 5 ms vs. 50 ms today, and an 8x broadcast data rate improvement).
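For a quick sanity check of that 5-10x claim, the ratios can be computed in a couple of lines (figures as cited in this post; the code is purely illustrative back-of-the-envelope arithmetic):

```python
# LTE vs. HSPA deltas, using the numbers cited above.
lte_peak_mbps, hspa_peak_mbps = 100, 14
lte_latency_ms, hspa_latency_ms = 5, 50

throughput_gain = lte_peak_mbps / hspa_peak_mbps  # ~7.1x, inside the 5-10x range
latency_gain = hspa_latency_ms / lte_latency_ms   # 10x lower latency

print(f"throughput: {throughput_gain:.1f}x, latency: {latency_gain:.0f}x")
```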
 
What does it mean for the design? Most design teams would be tempted to design a lot more optimized signal processing hardware, but by the time they were done with that, the standard would be moving towards LTE-Advanced (1 Gbit/s peak rate). Also, the flexibility of the operation of LTE does not lend itself to a fixed architecture. So, what should you expect to happen in these leading-edge design teams? They will use processors wherever possible. Standard cores, customizable cores and dedicated cores will be used to differentiate their architectures in terms of power and performance. This is multi-core design at its finest and, of course, comes with a huge amount of design challenges for the performance optimization of the architecture and the software.
 
Is this all worth it for the semiconductor industry? Well, the operators certainly believe so. According to ABI Research’s senior analyst Nadine Manjaro, “ABI Research believes that NTT will also deploy LTE in Japan in 2009. We forecast that by 2013 operators will spend over $8.6 billion on LTE base station infrastructure alone. For operators that have already deployed 3G networks, LTE will be a key CAPEX driver over the next five years.”
 
Interested in the most challenging designs around the world? Then quickly learn about LTE!