Mar 10, 2008

IBM, Hitachi team up to advance chip research

IBM and Hitachi researchers will try to accelerate the miniaturization of chip circuitry through atomic-level research on 32nm and 22nm semiconductors

IBM and Hitachi are expected to announce a research agreement on Monday in which the companies will collaborate to improve semiconductor technology, including shrinking the features on silicon chips.



Researchers from the companies will try to accelerate the miniaturization of chip circuitry through atomic-level research on 32-nanometer and 22-nanometer semiconductors. Making chip circuits smaller should allow computing devices to deliver power savings and performance gains. It will also make manufacturing more efficient, IBM said.

By combining research capabilities and intellectual property, the companies also hope to reduce the costs of developing advanced chip technologies, IBM said.

The tie-up with Hitachi is not linked to the Cell processor, which is the result of a separate development partnership between IBM, Sony, and Toshiba, IBM said. Though IBM and Hitachi work together on enterprise servers and other products, this is the first time they are collaborating on semiconductor technology.

Engineers from the companies will conduct research at IBM's Thomas J. Watson Research Center in Yorktown Heights, New York, and at the College of Nanoscale Science and Engineering's Albany NanoTech Complex, also in New York. Though the research does not apply directly to manufacturing, it could contribute to IBM's manufacturing processes as they relate to future silicon devices, IBM said.

Financial details of the two-year agreement were not disclosed. IBM officials declined to comment on when products resulting from the research would hit the market.

Chip makers such as IBM, Intel, and Advanced Micro Devices are constantly upgrading their manufacturing technologies to shrink chips. Intel began switching its manufacturing process to 45-nanometer chips last year, and AMD is scheduled to make a similar move later this year. Intel recently said it hopes to shrink the features on its chips to 22nm by 2011.

A nanometer is one billionth of a meter. In chip manufacturing, the figure refers to the smallest features etched onto the surface of the chips. As chip makers build smaller and smaller transistors, they are dealing with features that are in some cases just a few atoms thick.

IBM already has a strong profile in advancing semiconductor technology. It is developing silicon nanophotonics technology, which could replace some of the wires on a chip with pulses of light on tiny optical fibers for quicker and more power-efficient data transfers between cores on a chip. It is also working with U.S. universities on carbon nanotube transistors, which are smaller than today's silicon transistors and could deliver better performance.

 

Read More...

Transmeta founder Ditzel to join Intel

As they say, if you can't beat 'em, join 'em.

Dave Ditzel, co-founder of chip company Transmeta, is joining Intel's Digital Enterprise Group to work with Steve Pawlowski, one of Intel's top architects. An Intel representative confirmed a report put out over the weekend by The Register that Ditzel would be joining forces with his one-time enemy.



Transmeta was way ahead of its time in pursuing a low-power microprocessor strategy, attempting to break into the notebook PC and blade server markets with its Crusoe chip. The trouble was, Crusoe's low-power design came at the expense of performance, and manufacturing issues--combined with Intel's swift embrace of low-power tactics--killed Transmeta's chances of ever making a dent in Intel's or AMD's market share.

The company survives these days on its patent portfolio, licensing some of its low-power techniques and designs and filing lawsuits. Transmeta recently settled claims against Intel for $250 million.

Ditzel left Transmeta about a year ago. Intel declined to elaborate on exactly what he would be working on within DEG, but here's a bit of wild speculation to kick off a Monday morning: a server-grade version of the Atom processor?

 

Read More...

Chinese Hackers Worry Pentagon

WASHINGTON (Reuters) - China is developing weapons that would disable its enemies' space technology such as satellites in a conflict, the Pentagon said in a report released last week.



The report also said "numerous" intrusions into computer networks around the world, including some owned by the U.S. government, in the past year seem to have originated in China.

The assessments feature in an annual report on China's military power by the Pentagon for the U.S. Congress. Beijing routinely criticizes the report, saying it unfairly portrays China as a military threat when it is committed to peace.

David Sedney, a top Pentagon China specialist, said there was no call for U.S. alarm over China but repeated a frequent U.S. complaint that Beijing has not made clear the reasons for its rapid military modernization and spending growth.

"I think the biggest thing for people to be concerned about really is the fact that we don't have that kind of strategic understanding of the Chinese intentions," said Sedney, deputy assistant secretary of defense for East Asia.

"That leads to uncertainty," he said, briefing journalists at the Pentagon on the latest report.

China has posted a string of double-digit percentage rises in military spending in the past decade and many analysts say Beijing understates the amount it spends.

However, even the highest estimates of the true figure are dwarfed by U.S. defense spending.

Space and Cyberspace

Sedney said China's activities in both space and cyberspace were areas of concern.

"China is developing a multi-dimensional program to limit or prevent the use of space-based assets by its potential adversaries during times of crisis or conflict," the report said.

The report said the Chinese People's Liberation Army had developed a range of weapons and jammers to prevent an enemy from using space-based systems such as satellites.

"The PLA is also exploring satellite jammers, kinetic energy weapons, high-powered lasers, high-powered microwave weapons, particle beam weapons, and electromagnetic pulse weapons for counterspace application," it said.

It noted that China destroyed a defunct weather satellite in a test in January 2007, even though the incident also was included in last year's report. U.S. officials repeatedly have raised the shootdown as an issue of great concern.

"We continue to ask the Chinese to sit down and talk to us about that test and they haven't," Sedney said.

The United States blew apart a defunct satellite of its own with a missile from a Navy ship last month. The Pentagon said that was done purely to prevent potential harm to people.

Under the heading "Cyberwarfare Capabilities," the report stated that intrusions apparently from China into computer networks used "many of the skills and capabilities that would also be required for computer network attack."

It said it was not clear if the intrusions were carried out or backed by the Chinese military but "developing capabilities for cyberwarfare is consistent with authoritative PLA writings on this subject."

Last March, China announced a 17.8 percent rise in military spending to 350.92 billion yuan, or about $45 billion, for 2007. The Pentagon report said the true figure could be between $97 billion and $139 billion.

The Bush administration last month requested $515.4 billion for the Pentagon in the next U.S. fiscal year. That figure does not include extra spending for the wars in Iraq and Afghanistan or nuclear weapons programs run by the Department of Energy.

 

Read More...

Mar 3, 2008

Microsoft cuts retail Vista prices

Microsoft will slash prices of Windows Vista by up to 50 percent, mainly in developing countries; U.S., Europe will see smaller cuts, if any

Microsoft on Thursday said it plans to slash prices for retail copies of Windows Vista by up to almost 50 percent for certain editions in poorer countries, in order to boost sales that one analyst said have failed to meet expectations.

But many customers, especially those in wealthier markets such as the U.S. and Europe, may see additional discounts as small as 3 percent -- or none at all -- depending on which of Vista's four consumer versions they are interested in.

"The vast majority of our retail customers -- especially those in developed markets -- may not notice anything different from the promotions they've already seen in their region," according to a spokeswoman. "This is really about formalizing promotions we've run with several partners already to continue to grow our retail business."

In a Q&A interview posted on the PressPass section of Microsoft's Web site, Brad Brooks, the new corporate vice president for Windows consumer product marketing, said that the cuts will arrive "with the retail release of Windows Vista Service Pack 1 later this year," though some markets will see reduced prices sooner through promotions such as with Amazon.com in the U.S.

In developed markets, according to Brooks, Microsoft is mostly cutting prices for retail upgrade versions of Windows Vista Home Premium and Ultimate. "In emerging markets, we are combining full and upgrade Home Basic and Home Premium versions into full versions of these editions and instituting price changes to meet the demand we see among first-time Windows customers who want more functionality than is available in current Windows XP editions," he said. "In addition, we are also adjusting pricing on Windows Vista Ultimate in emerging markets to be comparable to price changes developed market customers will see."

"I think this is a smart strategic move," said NPD Group analyst, Chris Swenson. "Vista hasn't hit their initial expectations."

While Microsoft has sold more than 100 million Vista licenses in its first year -- a figure which excludes the tens of millions of Windows licenses sold to corporations -- more than 80 percent of those licenses have been sold to PC makers to install on new PCs, according to Swenson.

Retail copies of Vista sold through online and brick-and-mortar stores make up most of the rest, Swenson said. They are mostly bought by consumers upgrading their existing computers, as well as some do-it-yourselfers assembling their own PCs, he said.

Microsoft can afford the discounts, since it makes much more money per retail copy of Vista than per OEM license sold to a PC manufacturer.

In fact, Microsoft has previously done just that, offering a flock of retail discounts at Vista's launch a year ago.

But first-week retail Vista sales in the U.S. were off 60 percent from those of its predecessor, Windows XP, according to NPD.

U.S. retail sales for all versions of Windows in 2007 were up 41 percent from 2006, according to NPD. (That figure sounds less impressive when one considers that 2006 Windows sales were actually down 18 percent from 2005.)

In terms of the mix in the U.S., half of the copies of retail Vista sold last year were for the Home Premium edition, which sold for about $174, according to NPD. The pricey Ultimate edition, which sold for an average $274, made up 24 percent of unit volume.

Swenson says one reason retail Vista sales are weaker than XP's is the five-year gap between the two releases. By contrast, XP was released only one year after Windows 2000 and ME. That meant that consumers who bought a new PC with 2000 or ME would have been more likely to upgrade it with XP. Not so for consumers who bought a new XP PC three or four years ago; machines of such comparatively advanced age are unlikely to meet Vista's hardware requirements.

Moreover, hardware price points have fallen another 25 percent since XP's release, according to Swenson.

Finally, running Vista with its full Aero desktop turned on requires fairly powerful PC hardware. All of these factors combine to make it more attractive for consumers to buy a whole new computer with Vista on it than to upgrade an existing PC, he said.

Swenson doesn't think Microsoft's move is a tactical attempt to combat ongoing negative publicity surrounding Vista, including a lawsuit alleging that 'Vista Capable' PCs were not truly Vista capable.

"I doubt the two are tied," he said. Microsoft "really wants to help spark Vista sales, though I don't see it taking off like a rocket like the way Office did after its price was cut."

He also doesn't see a link between the price cuts and the failure of Microsoft's Anytime Upgrade program, which let consumers upgrade their edition of Vista by purchasing a digital key from Microsoft online. Microsoft terminated the program last month.

"It was probably ahead of its time, and thus not successful, and so they got rid of it," he said.

 

Read More...

You Work Harder Than Your Boss: Survey

The vast majority of U.S. workers say they work much harder than the president of their firm, according to a new poll from Monster.com.



NEW YORK - The vast majority of U.S. workers say they work much harder than the president of their firm, according to a new poll from employment advertising company Monster.

A full 77 percent of respondents argued they toiled longer and harder than the occupants of the corner office, the survey found.

Unscientific by the company's own admission, the poll findings speak nonetheless to a general sense in American society that the higher echelons have it too easy, getting paid a whole lot more for doing far less.

"Nowadays, with the ratio of CEO pay to the average worker exploding, feelings of disenfranchisement from not being compensated fairly are much more likely," said Steven Blader, assistant professor of management and organization's at New York University's Stern School of Business.

The results were based on 5,369 votes cast by website users on the Monster homepage. Only one vote per user was counted toward the final tabulation.

(Reporting by Pedro Nicolaci da Costa and Ellen Freilich)

Copyright 2008 Reuters.

 

Read More...

Intel chooses 'Atom' name for new chips

Intel announced Sunday that it has chosen the name "Atom" for a new family of ultra-small chips.



The "Atom" moniker will be applied to a family of chips with two members that are expected to be released later this quarter. One--previously know as Silverthorne--is a low-power mobile processor destined for the next generation of mobile Internet devices. It incorporates a new low-power state, allowing it to essentially shut down in between processing tasks and limit power consumption.

The other, code-named Diamondville, is a single-core processor for ultra-low-cost laptops. Intel refers to the low-cost notebook design as a "netbook" and estimates that pricing for these devices will go as low as $250. Diamondville is a tiny 45-nanometer processor that employs a simpler design than standard Intel processors.

Intel also rebranded its Menlow platform as Centrino Atom--a low-power package with integrated graphics and a wireless radio that enables thinner and lighter designs.

 

Read More...

Feb 27, 2008

Microsoft Readies Internet Explorer 8 Beta

Microsoft says enhancements planned for IE 8 include improved support for Ajax programming and better security.



Microsoft (NSDQ: MSFT) plans to launch a test version of the next major edition of its Internet Explorer Web browser by the end of June at the very latest, a company official said in a blog post Tuesday.

"A beta version of Internet Explorer 8 will be released in the first half of 2008," said a developer writing on Microsoft's Explorer blog.

Currently in development, Explorer 8 hit a major milestone in December when it passed the "Acid2 Face" test, which measures the extent to which a browser conforms to a series of widely used Web standards. Among the enhancements planned for IE 8 are improved support for Ajax programming and better security, Microsoft has said.

Microsoft also needs to ensure that Explorer 8 will be compatible with Web sites designed for earlier versions of the software. A number of corporate IT departments, as well as the federal Department of Transportation, have shied away from IE 7 -- released in October 2006 -- due to such concerns.

Explorer 8 is just one of numerous products that Microsoft plans to release in the months ahead. On Wednesday, the company plans to formally launch Windows Server 2008, SQL Server 2008 and Visual Studio 2008 at an event in Los Angeles.

Later this year, Microsoft will release Small Business Server 2008. It's also planning to ship a public beta of its Silverlight 2 Web presentation technology in the coming weeks.

All of this comes at a time when Microsoft is engaged in a high-stakes bid for its Internet rival Yahoo. Microsoft has offered more than $40 billion for Yahoo, but so far the board of directors at the Web portal has rejected the pitch.

Some analysts have questioned whether Microsoft would be able to keep all of its product development schedules on track while at the same time attempting to close and integrate a multi-billion dollar acquisition.

By Paul McDougall
InformationWeek

 

Read More...

WorldWide Telescope peers into Big Dipper

Microsoft on Wednesday gave TED conference-goers--an audience typically filled with stars like Goldie Hawn or Forest Whitaker--a close-up of real celestial bodies with its new virtual telescope.



Microsoft demonstrated long-awaited software called WorldWide Telescope to an audience at the exclusive Technology Entertainment and Design conference in Monterey, Calif., a four-day confab that started Wednesday. It's unclear whether the demo of the astronomy technology made anyone in the audience cry like former Microsoft evangelist Robert Scoble, but the images were certainly stellar.

WorldWide Telescope, similar to the sky feature in Google Earth but much more expansive, is a virtual map of space that features tens of millions of digital images from sources like the Hubble telescope and the Sloan Digital Sky Survey, a project championed by missing Microsoft researcher Jim Gray (to whom Microsoft dedicated the WorldWide Telescope on Wednesday). From the desktop, the technology lets people pan and zoom across the night sky, zeroing in on the Big Dipper, Mars, or the first galaxies to emerge after the Big Bang. It also lets people call up related data, stories, or context about what they're seeing from sources online.

Harvard University astrophysicist Roy Gould, who demonstrated the telescope with Microsoft principal researcher Curtis Wong, said that the technology holds promise for research and for humanity.

"The WorldWide Telescope takes the best images from the greatest telescopes on Earth...and in space...and assembles them into a seamless, holistic view of the universe," Gould, of the Harvard Center for Astrophysics, said at the conference.

"This new resource will change the way we do astronomy...the way we teach astronomy....and, most importantly, I think it's going to change the way we see ourselves in the universe."

Microsoft also unveiled a promotional site for the telescope project Wednesday, but the free technology won't be live until sometime this spring. Without the tears, several academics talk up the telescope in video on the site. Here is a sampling of the awe-struck sentiment: "It's the universe that you yourself can voyage through." "It's a magic carpet." "It's an example of where science and science education is going." "My hope is to have it on every kid's desktop."

Stefanie Olsen

 

Read More...

What's Inside Google Android?

Linux, Java, and a few more surprises. See what's under the hood of Google's upcoming mobile platform.



Android is Google's foray into the handheld OS realm. It follows a path trodden by -- among others -- Symbian's Quartz, the SavaJe operating system, and J2ME. In fact, one of Android's stated goals is to overcome some of J2ME's shortcomings. Whether or not Android succeeds, either at that specific goal, or in general, remains to be seen.

This article addresses a specific question: What is it like to work with the Android SDK? And to a lesser extent: What is under the Android hood? As these questions are answered, bear in mind that the version of the Android SDK I used was not in final form. Some of the problems described may have -- in fact, I hope will have -- been corrected by the time you read this. In addition, while Android development is supported on Linux, Mac OS X, and Windows, I did all my testing on Windows systems.

Inside an Android

Peel away Android's carapace, dig down to its marrow, and you'll find a Linux kernel. Libraries are a layer above, a variety of frameworks above that, and a final layer of applications sits on the top. The library layer is home to code for entities such as media processors for playback and recording of audio and video, the core of the Web browser, font rendering, and the SQLite relational database engine. The Android runtime also lives in the library layer.

Above the libraries reside frameworks, which are sets of reusable services and common components available to applications. For example, one sort of framework is a content provider, which is any service that handles the storage and retrieval of data. The application interface into the SQLite engine is a specific instance of a content provider.

Applications run at the top of the OS stack. Android will ship (assuming that it eventually does ship) with a set of core applications, including an e-mail client, a calendar, a Web browser, and more. And, of course, it is toward this topmost layer that all of the faculties of the Android SDK are directed.

When a developer writes an Android application, that developer codes in Java. The Java source is compiled to Java bytecodes, but -- to execute the application on Android -- the developer must execute a tool called dx. This tool converts Java bytecode to what is referred to as dex bytecodes. "Dex" is short for "Dalvik executable," Dalvik being the virtual machine that actually executes Android applications.

From a developer's perspective, Dalvik looks like a Java Virtual Machine, but strictly speaking, Dalvik is not a JVM. As stated above, Dalvik executes dex bytecode, not Java bytecode. And there are differences in the structure of Dalvik class files as compared to Java class files. Nevertheless, for all intents and purposes, building an Android application is really an exercise in building a peculiar sort of Java application.

The Android SDK

The Europa version of Eclipse is the preferred development platform for Android applications. In addition, you need a full JDK installation, version 5 or 6, to use the Android tools (the JRE that Eclipse typically installs is insufficient). Instructions on the Android site walk you through installing the Android Development Tools plug-in for Eclipse, and verifying the installation's correct operation by guiding you through the creation and execution of a quintessential "hello world" application.

However, you are not tied to Eclipse as your Android development system. The Android SDK does provide tools that let you use other IDEs in place of Eclipse. For example, the IntelliJ IDE is mentioned specifically in the Android documentation.

Hard-core developers will be satisfied to work solely with the collection of command-line tools that come with the SDK. For example, the activityCreator tool -- which is provided as a batch file for Windows, and as a Python script for Mac and Linux users -- will construct the framework for an Android Activity. (Activity is the Android equivalent of an application; more on this later.) Executing activityCreator will build skeletal Java files, create the Android project's required subdirectories, and build the necessary manifest XML files. The tool also creates an Ant script file for compiling the source and building the application. Once built, the application can be launched via the SDK's adb tool, the Android debug bridge.

Other command-line tools in the SDK include logcat, which outputs a log of system messages. Thanks to the stack trace provided by logcat, it is useful whenever an error occurs on the Android emulator. If you need deep analysis of what is going on in errant code, you can import a special Debug class into your application. This class provides methods for starting and stopping execution traces. When activated, Debug will log method calls to a trace file, which can be examined later with the toolkit's TraceView application. From within TraceView, you can view thread interactions, as well as examine execution paths. TraceView also shows the amount of time spent in each method, so you can use the tool as an execution profiler.
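
As a minimal sketch of that tracing workflow in Java (the class name, trace-file name, and workload here are invented for illustration):

    import android.os.Debug;

    public class TraceDemo {
        public void runTracedWork() {
            // Start logging every method call to a trace file on the
            // device/emulator ("demo" is an invented name for this sketch)
            Debug.startMethodTracing("demo");
            doExpensiveWork();
            // Stop tracing; the resulting file can then be opened in
            // TraceView to inspect call paths and per-method timings
            Debug.stopMethodTracing();
        }

        private void doExpensiveWork() {
            long sum = 0;
            for (int i = 0; i < 100000; i++) { sum += i; } // placeholder workload
        }
    }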

Finally, there is the Android emulator itself. When started, the emulator displays the skin of a hypothetical android device, complete with specialized faceplate buttons and QWERTY keyboard. It does its best to mimic an actual device, though there are understandable limitations (it cannot, for example, take incoming phone calls). The Android emulator runs a modified version of Fabrice Bellard's excellent open source simulation/virtualization environment, QEMU. Android's version of QEMU simulates an ARM processor, and on that processor executes the Linux OS.

Working with Eclipse

Once the Android Eclipse plug-in is installed, building an Android application is much like building any other application. The plug-in adds an Android Activity project to Eclipse's project templates tree. Start a new project, and the plug-in builds the foundational Java files, creates the necessary folders, and constructs skeletal resource files.

The Eclipse plug-in handles compilation, conversion to dex, launching the emulator, and downloading the application. Because writing Android code is writing Java code, the editor behaves as it would were you constructing an ordinary Java application. Resource files, which are written in XML, are easily managed by XML editors already available in Eclipse. Debugging is likewise supported from within Eclipse, and Android opens a debug perspective that anyone already familiar with Eclipse will be comfortable with.

Unfortunately, Android introduces a whole new lingo for developers to memorize. Roughly speaking, an application is an Activity. (The current documentation is only marginally helpful on this point, describing an Activity as "a single focused thing that a user can do.") Within an activity, you define one or more views. A view -- realized via the View class -- corresponds to an area on the screen, and manages the drawing and event trapping of its associated area. (So, for Java developers, a View is roughly equivalent to a Canvas.) Event handling is governed by the Intent class, which models an activity's intention to handle events of a given kind.
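
To make the new vocabulary concrete, here is a minimal Activity sketch modeled on the SDK's "hello world" (the class name and message are invented; the onCreate signature follows the SDK samples of the time):

    import android.app.Activity;
    import android.os.Bundle;
    import android.widget.TextView;

    public class HelloActivity extends Activity {
        // Called when the activity is first created
        @Override
        public void onCreate(Bundle icicle) {
            super.onCreate(icicle);
            // A TextView is a simple View; setContentView gives it the
            // screen area that this Activity manages
            TextView view = new TextView(this);
            view.setText("Hello, Android");
            setContentView(view);
        }
    }

Declared in the project's manifest, this is essentially all it takes to put a View on the emulator's screen.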

In short, be prepared to spend some time in the documentation matching what you already understand about GUI application development with the corresponding elements as Android calls them. The documentation is reasonably good on this matter. Nevertheless, as is typical, I found the provided example code to be far more useful.

Just before I began testing the Android SDK in mid-February, a new SDK was released (m5-rc14, to be exact). I installed that SDK (and Eclipse plug-in) on a 1GHz, 1GB Windows XP system. Though installation went smoothly, the emulator took on the order of 30 minutes to complete its boot process when an Activity was launched from within Eclipse. Other users on the Android message boards reported similar behavior, though the problem was by no means universal. The solution appeared to be faster hardware. Luckily, I had a more powerful machine on hand: a 3GHz processor with 2GB of RAM (running Windows 2000). I reinstalled Eclipse and the Android SDK on this faster system, and -- sure enough -- the emulator was up and running my trial application in about 30 seconds after the launch from Eclipse.

Now, 30 seconds -- though worlds better than 30 minutes -- is by no means a comfortable launch latency, particularly if you're stuck in a heavy execute-crash-debug-fix-execute cycle. And I'm not certain that delay can be reduced appreciably. Remember, when you start the emulator, you're starting the QEMU environment, followed by a booting of the Linux kernel, which has to fire up all the framework services. In other words, a lot has to happen just to get to the first bytecode of your target application.

Android on the March

Android is definitely a work in progress. If you want to try your hand at creating a significant Android application with the existing toolkit, I salute you. But be prepared for a challenge.

My biggest concern with Android is that I find nothing compelling in it that sets it apart from other handheld OSes. Though one might be tempted to point to the inclusion of the SQLite database engine as significant, I am unconvinced that an SQL-speaking relational database is the "killer feature" that will help Android succeed where other, similar handheld OSes have simply fizzled.

Nevertheless, Android has the weight of Google behind it. Whether that weight is sufficient to propel Android where other handheld OSes have not gone before is uncertain. For now, I'll simply say that Google has a lot of propelling to do.

Rick Grehan, InfoWorld
 

Read More...

Intel Core 2 Duo E8500 Processor Review - 45nm Wolfdale



Intel Dual-Core Processors Go 45nm - Wolfdale



When AMD released the quad-core Phenom processor series last November, the industry was shocked by the low performance numbers and clock frequencies that Phenom launched with. If that was not enough, AMD then had to deal with TLB erratum number 298, which required a BIOS workaround that fixed the issue at the cost of a large performance loss. Amid this bad news, other headlines from November to February revolved around video cards, as both NVIDIA and ATI launched new series that marked significant improvements over previous generations. With all the media focus going to the new video cards and to how badly Phenom was doing, almost no one noticed that Intel refreshed its Core 2 Duo processor lineup. The old 65nm Conroe dual-core processor that we have all come to know and love has been replaced by a new 45nm Wolfdale dual-core processor! The Wolfdale-based processors have the same technology benefits that LR has already covered in previous articles, so if you don't know about high-k + metal gate transistors or Intel's lead-free technology, you have some catching up to do.



With a die size of just 107mm² and 410 million transistors, Wolfdale is smaller than its predecessor, Conroe, which had a die size of 143mm² with 291 million transistors. Most of the 119 million new transistors go to the larger 6MB L2 cache on Wolfdale, as Conroe had just 4MB; others are dedicated to the new SSE4 instruction set and the super shuffle engine. Impressively, the TDP (thermal design power) rating for the chip stays the same at just 65W. For comparison, the fastest desktop processor Intel offers right now, the quad-core QX9770, has a TDP rating of 136W. All of the Intel dual-core Wolfdale processors are rated at 65W TDP and have 6MB of L2 cache.



Intel currently offers four Wolfdale processors, and half multipliers are back in action: the Intel E8500 has a multiplier of 9.5. Of the four dual-core Wolfdale processors, you might be wondering what the difference between the E8200 and the E8190 is, as they have the same basic features. Basically, the E8190 lacks Virtualization and Trusted Execution Technology. Many people don't even know what virtualization is, so one can expect prices on the E8190 to eventually fall below those of the E8200. It should also be pointed out that rumors are going around that Intel will have a Core 2 Duo E8300 with an 8.5x multiplier and an E8600 with a 10x multiplier coming out later this year.
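
(For reference, the core clock is simply the front-side bus clock times the multiplier. Assuming the E8500's standard 333MHz bus, that works out to 333MHz x 9.5 = 3.16GHz; the rumored 10x E8600 would land at 3.33GHz.)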



Today, we will be testing the Intel Core 2 Duo E8500 processor against six other processors, but keep an eye on the $269.99 Intel Core 2 Quad Q6600 and the $259.99 AMD Phenom 9600 Black Edition, as these processors are in the same price range.

 

Read More...

Feb 23, 2008

Is Silicon Valley the new Detroit for electric cars?

SAN JOSE, Calif.--Silicon Valley is sparking a revolution in alternative-fuel autos, but it may take a while--too long, perhaps--to effect change in Detroit, according to a panel of auto executives.



A group of electric and traditional carmakers spoke here Friday at the Joint Venture Silicon Valley conference about innovation, why alternative carmakers are attracted to the Valley, and whether nimble upstarts can overshadow the big Detroit automakers. The consensus was that Silicon Valley is commanding the attention of the auto world, whether it will dominate or not.

"We're not going to take over China or Detroit, but every carmaker has an outpost here and is watching what people are doing," said Felix Kramer, founder of nonprofit plug-in hybrid initiative CalCars. "This can be a real incubation area for new technology in automotive."

To be sure, Silicon Valley is rife with change when it comes to the merger of technology and autos.

Volkswagen, for example, recently funded a new car lab at Stanford University whose mission is to pursue "cutting-edge research in safety, comfort, and fun for the consumer driving the car," said Sebastian Thrun, speaking at an artificial intelligence conference Thursday night. The lab, which will open later this year, will focus on new technologies such as computer-assisted driving--for instance, a car that could park itself. Eventually, self-driving or smart cars could help make driving more efficient and safe, Thrun said.

"When kids can drive themselves to soccer, and do away with the soccer parent, humanity will be better off," Thrun said.

Elon Musk's Tesla Motors, also based in Silicon Valley, is delivering its first production models of an electric two-seater roadster, for a price of nearly $100,000. It eventually plans to sell a four-door electric car for about half the price and then even more affordable models later.

Another Palo Alto upstart, Project Better Place, founded by former SAP executive Shai Agassi, recently announced that it will team up with carmakers Renault and Nissan, along with the Israeli government, to develop electric cars and battery-charging stations in that nation. It has raised $200 million to produce lithium-ion batteries and the facilities to recharge them--and its cars are expected to be ready by 2011.

Google, based in Mountain View, also recently announced Recharge It, a project to convert hybrids to plug-in hybrids and test vehicle-to-grid technology, in which the vehicle's battery powers the electrical grid. Milpitas-based OEMtek is charging people $12,500 to convert their Toyota Prius into a more efficient car (getting 100 miles per gallon vs. 45 miles per gallon) with a larger battery.

San Dimas-based AC Propulsion, which makes an all-electric Scion eBox for $70,000, is also opening an office in Palo Alto to serve customers here, according to Tom Gage, CEO of AC Propulsion, who spoke on the panel. (Gage drives an eBox, an electric car that gets 120 miles on one charge. The company's first customer was actor Tom Hanks.) AC Propulsion also supplies technology to Tesla Motors.

So why is Silicon Valley such a hotbed for alternative cars? It's the customers.

"The driving public here is among the most enlightened in environmental and policy issues," Gage said.

CalCars' Kramer added to the sentiment: "The plug-in hybrid is the first thing to come here because of popular demand," he said, referring to the movement behind CalCars, Ourpower.org, and Google's plug-in effort. "There's a different customer here in the Valley, and that's why we favor this area."

Backing up his point, 30 percent of the people in the audience said in a poll that they drove a hybrid to the conference.

Byron Shaw, managing director of General Motors' Advanced Technology Office, who is based here, spoke on the panel about the goals of GM, one of the first major car companies to say that it will develop a plug-in hybrid. Shaw said that the company plans to introduce the first rendition of the plug-in Chevy Volt in 2010, along with similar versions for Saturn. He said that GM will also sell a bevy of alternative-fuel vehicles in the next decade, including electric cars, fuel cell cars, and vehicle-to-grid plug-ins.

"There's an opportunity to bring Silicon Valley and the auto industry together because the two don't always march to the same drum," he said. "We have a wealth of experience of building vehicles, but there are things changing that now, such as the conventional cost of fossil fuels. In the same way Silicon Valley has driven down costs of technology, it may happen with the auto industry, too."

That said, GM is slower than the technology industry, he said, and the company is driven by a fickle consumer. One consideration, for example, is that the battery for a hybrid plug-in must operate well in cold climates like Minnesota as well as warmer places like Phoenix. "The supply base just isn't there for electric vehicles," he said.

AC Propulsion's Gage said that after working in Detroit for eight years, he's seen that car companies can change for the consumer, but it will be especially challenging in the alternative fuel market.

"It's a major transformation for the car companies," he said, "the power train is different; fuel sources are different. We have to start small and build a market base, and it has to appeal to consumers. To come back to this, Silicon Valley is more advanced in this area. Grassroots efforts will continue."

CalCars' Kramer went further with his criticism.

"They're being too slow. It's a major wedge for climate change. They need to learn about versioning--getting cars on the road and seeing what people like," Kramer said.

The panelists finished by predicting how many cars would be electric or plug-in hybrid by 2028. Two of them, Shaw and Kramer, forecast 80 percent of cars on the road by then; Gage was more conservative at just 20 percent. The question is: Will that be enough to turn the tide of global warming?

 

Read More...
THE ARTICLES WERE QUOTED FROM VARIOUS SOURCES

SeeITNews
email: seeitnews@gmail.com