A group of French industry and government researchers has demonstrated how to integrate end-to-end scheduling analysis using Autosar 4.0, the latest release of the emerging Automotive Open System Architecture. Scheduling analysis is a timing verification method to ensure that a real-time system can meet its timing requirements. The lack of this critical capability has been a significant limitation of the Autosar industry standard for embedded automotive software systems.
“Our work shows OEMs and suppliers that they can use scheduling analysis to verify the timing constraints of Autosar systems at the model level,” said coauthor Stefan Kuntz, project manager at Continental Automotive France/Germany, one of the world’s top five automotive suppliers, which conducted the research in conjunction with CEA LIST, the French government Laboratory of Applied Research on Software-Intensive Systems.
The research appears in a paper presented at the March 2011 14th IEEE International Symposium on Object/Components/Service-Oriented Real-Time Distributed Computing (ISORC 2011).
Autosar Industry Consortium and Standard
Under development since 2003, Autosar is an international industry effort to define a model-based architecture with standard interfaces. The model aims to help reduce the number of electronic control units (ECUs) in an automobile—50 is the current average—and simplify software development and integration. The consortium motto, “Cooperate on standards; compete on implementation,” reflects a fundamental change in automotive design from an ECU/code-based approach to a function-based/model-driven approach.
Earlier efforts to perform model-level scheduling analysis for Autosar had failed for lack of a means to express timing information. Autosar 4.0, released in December 2010, added timing extensions that car manufacturers and their suppliers can use to specify timing behavior at different Autosar system levels. However, the extensions provide only a language for expressing timing-related information, said Saoussen Anssi, a software development engineer at Continental Automotive France and PhD student supervised by CEA LIST. “Users must still characterize an analyzable system model to test,” she explained.
Anssi is the lead author of the ISORC research paper, in which she and her colleagues describe the basic features of an analyzable Autosar model as required by scheduling analysis. They illustrate their approach in a cruise-control system case study and present results that confirm its feasibility.
The function-based, model-driven approach requires a change in the way scheduling analysis is used to verify real-time system constraints.
“The old ECU-based approach expressed timing constraints as simple deadlines on operating system tasks,” Kuntz explained. “It performed scheduling analysis to calculate the response time of each task and compare it with the deadline. But a function-based architecture like Autosar expresses timing constraints as end-to-end deadlines on function flows that might be distributed over many ECUs.”
This capability lets all involved parties assess the dynamic behavior of vehicle functions much earlier than in the past, Kuntz said, which means significant cost savings.
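The task-level check that Kuntz contrasts with Autosar's end-to-end approach can be sketched in a few lines. The following is an illustrative implementation of classic fixed-priority response-time analysis, not the Autosar tooling itself: each task's worst-case response time is computed by fixed-point iteration and compared against its deadline.

```python
import math

def response_time(tasks):
    """Classic fixed-priority response-time analysis (illustrative sketch).
    Each task is (C, T, D): worst-case execution time, period, deadline.
    Tasks are listed highest priority first. Returns (response time,
    deadline met?) for each task."""
    results = []
    for i, (C, T, D) in enumerate(tasks):
        R = C
        while True:
            # Interference from higher-priority tasks released during R.
            interference = sum(math.ceil(R / Tj) * Cj
                               for Cj, Tj, _ in tasks[:i])
            R_next = C + interference
            if R_next == R or R_next > D:  # converged, or deadline missed
                R = R_next
                break
            R = R_next
        results.append((R, R <= D))
    return results

# Three periodic tasks (C, T, D), highest priority first.
print(response_time([(1, 4, 4), (2, 6, 6), (3, 12, 12)]))
# → [(1, True), (3, True), (10, True)]
```

In an end-to-end setting like Autosar's, the deadline instead applies to a chain of such tasks spanning several ECUs and buses, so each link's response time contributes to one function-flow latency.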
For More Information
The research paper, “Enabling Scheduling Analysis for Autosar Systems,” is available to IEEE Computer Society Digital Library subscribers at http://doi.ieeecomputersociety.org/10.1109/ISORC.2011.28.
Published Date 8/12/11 11:48 AM
Siemens has developed a new algorithm for traffic control systems that dynamically adjusts toll charges for special lanes based on traffic conditions. The system, known as “Fast Lane,” has been installed on a highway connecting Jerusalem and Tel Aviv in Israel. During peak traffic, some toll roads reserve special lanes and charge vehicles a fee to use them, encouraging commuters to carpool or take public transportation. The goal is to set the fee so that the lanes stay in use without becoming congested. The Siemens Mobility system uses induction loops embedded in the road surface to count vehicles, and the algorithm uses that data to recalculate the toll minute by minute, which Siemens says keeps traffic density on the special lane evenly distributed. If traffic is lighter, for example, the fee drops. The package also includes a billing system. Siemens says its 12-kilometer “Fast Lane” cuts a peak-commute drive that typically takes 30 minutes to an hour down to 12 minutes. (PhysOrg.com)(Siemens)
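The pricing idea described above can be sketched simply. The actual Siemens algorithm is not public, so the function below is a hypothetical illustration: the per-minute fee interpolates between a minimum at free-flow demand and a maximum at a congestion threshold, with all thresholds and fees invented for the example.

```python
def toll_for_interval(vehicles_per_minute,
                      free_flow=20, congested=60,
                      min_fee=1.0, max_fee=10.0):
    """Hypothetical density-based toll pricing sketch (not the Siemens
    algorithm). Fee rises linearly from min_fee at free-flow demand to
    max_fee at the congestion threshold; parameters are illustrative."""
    if vehicles_per_minute <= free_flow:
        return min_fee
    if vehicles_per_minute >= congested:
        return max_fee
    frac = (vehicles_per_minute - free_flow) / (congested - free_flow)
    return round(min_fee + frac * (max_fee - min_fee), 2)

print(toll_for_interval(15))  # light traffic  → 1.0
print(toll_for_interval(40))  # moderate       → 5.5
print(toll_for_interval(70))  # near capacity  → 10.0
```

A deployed system would feed this kind of function with the induction-loop counts each minute, so the fee tracks demand continuously rather than switching between fixed peak/off-peak rates.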
Published Date 8/12/11 11:43 AM
CERN has announced an effort to augment its Worldwide Large Hadron Collider Computing Grid with distributed computing using volunteers’ home computers. Participating citizen scientists can contribute their unused computing power to help researchers simulate the origin of the Universe and find subatomic particles, including aiding the search for the Higgs boson. Specifically, the home computers will simulate particle collisions; those results will then be compared with LHC data, where discrepancies could uncover “any sign of disagreement between the current theories and the physical Universe,” according to the LHC@home 2.0 website. “Ultimately, such a disagreement could lead us to the discovery of new phenomena, which may be associated with new fundamental principles of nature.” (Discovery News Daily)(BBC)(LHC@home 2.0)
Published Date 8/12/11 11:40 AM
A team of Rice University researchers has successfully produced high-quality graphene from a single box of Girl Scout cookies. The researchers had previously made graphene from table sugar. At a group meeting, James Tour, professor of chemistry, computer science, and mechanical engineering and materials science, mentioned that graphene could be grown from any carbon source. Girl Scout cookies were being shared at the time and were offered as an acid test of the concept. The cookie of choice: shortbread. Tour’s group has since shown that a single-atom sheet of graphene can be made from almost any carbon source, including a cockroach leg, chocolate, polystyrene, and even miniature dachshund droppings. The researchers made the graphene by carbon deposition on copper foil. The graduate student researchers calculated that, at the then-current commercial rate for pristine graphene of US$250 for a two-inch square, a single box of cookies could yield US$15 billion worth of graphene. Members of Girl Scout Troop 25080 of Houston were invited to the lab to see the process. The work was published in ACS Nano. (SlashDot)(Rice University)(ACS Nano)
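The US$15 billion figure can be sanity-checked with back-of-the-envelope arithmetic. Using graphene’s areal density of roughly 0.77 mg per square meter and the quoted price of US$250 per two-inch-square sheet, we can solve for how much carbon the estimate implies; the cookies’ actual carbon yield is not stated in the article, so this only checks plausibility.

```python
# Rough check of the US$15 billion estimate.
# Known inputs: graphene's areal density is about 0.77 mg/m^2, and the
# quoted price was US$250 for a 2-inch-square sheet. We solve for the
# carbon mass the estimate implies.
PRICE_PER_SHEET = 250.0             # US$ for a 2 in x 2 in sheet
SHEET_AREA_M2 = (2 * 0.0254) ** 2   # 2 inches = 0.0508 m per side
AREAL_DENSITY_MG = 0.77             # mg of carbon per m^2 of graphene

price_per_m2 = PRICE_PER_SHEET / SHEET_AREA_M2          # ~US$96,875/m^2
area_m2 = 15e9 / price_per_m2                           # ~155,000 m^2
carbon_needed_g = area_m2 * AREAL_DENSITY_MG / 1000

print(f"{carbon_needed_g:.0f} g of carbon")  # → 119 g of carbon
```

Roughly 120 g of carbon is plausible for the contents of one box of shortbread cookies, so the headline figure holds up to a first-order check.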