VMware's third-quarter profit dropped by more than half as the company invested more heavily in research and development, though its revenue increased 4 percent from a year earlier, the company said Wednesday. Net income for the quarter ended Sept. 30 was US$38 million, or $0.09 per share, down from $83 million, or $0.21 per share, in the third quarter of 2008, VMware said. Excluding one-time items, the company made a profit of $95 million, or $0.24 per share, better than the $0.20 per share analysts had been expecting, according to Thomson Reuters.

Revenue for the quarter was $490 million, up from $472 million for the same period last year. Analysts had been expecting $474 million in revenue, Thomson Reuters said. "Our solid third-quarter results were driven by strength in the U.S. federal sector, increased transaction volumes and particularly robust growth in our maintenance renewals," Chief Financial Officer Mark Peek said in a statement. The economy remains challenging but VMware now has better visibility into the coming quarters, Peek said. He forecast fourth-quarter revenue of $540 million to $560 million, ahead of current estimates, but said revenue in the first quarter next year is likely to be down sequentially. VMware spent $133.5 million on R&D during the quarter, up from $85 million in the same period last year. Sales and marketing expenses also climbed, as the company continued to promote a new version of its virtualization software, vSphere 4.

A 1 percent drop in revenue from the U.S. was more than offset by a 9 percent increase in international sales, VMware said. Services revenue, which includes maintenance fees and professional services, climbed 33 percent to $249 million, while revenue from new software licenses declined 16 percent to $240 million.

The European Union should update its laws to better suit cloud computing, Microsoft's top lawyer urged in Brussels on Tuesday. The existing E.U. laws "are starting to show their age," Microsoft head legal counsel Brad Smith told an audience of students, E.U. officials and industry executives. Laws covering data protection and data retention go back to the mid-1990s and need to be revised to take into account the massive and constant flow of data between users' computers and multiple cloud servers that can be located anywhere in the world, Smith said in his speech.

The data retention law, passed in 2006 when cloud computing was mostly limited to online e-mail accounts, calls on the 27 E.U. member states to each set their own length of time for the retention of data, between six and 24 months. Under the directive, authorities can request access to details such as the IP address and time of use of every e-mail, phone call and text message sent or received, and a request to access the information can be granted only with a court order. Countries have chosen widely varying retention times, and the discrepancy makes compliance by telecom and cloud services companies complicated, Smith said. A subscriber to cloud services in, say, Italy, with a relatively short data retention regime, may see their data stored in Ireland, which has one of the longest retention times.

He urged the E.U. to change the law, either by setting a standard length for retention time, such as a year, for all E.U. member states, or by applying mutual recognition principles, so that each country applies the retention time of the country where the user's data is being stored. Greater harmonization is needed in the 1995 data protection directive too, Smith said. "At the time it was passed, data moved location occasionally. In the cloud it is constantly moving," he said. While technology and its uses have moved on since 1995, the importance of privacy hasn't changed, Smith said. And in a thinly veiled attack on Microsoft rival Google, he added: "Some firms based on the West Coast (of the U.S.) don't think privacy is so important. I disagree. The right to privacy is the right to choose who sees your information and it is as important now as it was 15 years ago."

Data transfers are global, and while Smith acknowledged that a global data protection treaty is unlikely in the near future, he suggested that trading partners including the U.S. and E.U. should try to forge an agreement similar to the bilateral free trade agreements they both sign up to in other commercial fields. For wireless connections, he urged the E.U. to free up more radio spectrum for services that run in the cloud. His final piece of advice for E.U. lawmakers concerned broadband access to the Internet, a prerequisite if cloud computing is to take off in the way many people predict it will in the coming years. "Governments need to consider how to enhance broadband, both wired and wireless," Smith said.

Verizon has responded to the AT&T lawsuit over the "There's a Map for That" marketing campaign with a legal filing of its own, mocking AT&T for whining. Verizon's filing says "the truth hurts" and claims AT&T is simply trying to squash the marketing before the holidays.

Legal briefs and filings don't usually make very compelling reading, but this one is actually a pretty good read. Verizon's message to the court and to AT&T essentially boils down to three words: "the truth hurts." The Verizon legal team should be commended. The filing has a little drama, a little humor, and ultimately makes the point that AT&T is simply trying to use the courts to obscure the simple truth that its 3G network is inadequate. In the filing, Verizon states, "Remarkably, AT&T admits that the 3G coverage maps (the one thing that is common to all five ads) are accurate and that the ads' express statement that Verizon has '5X More 3G Coverage' than AT&T is true." Verizon goes on to say, "In the final analysis, AT&T seeks emergency relief because Verizon's side-by-side, apples-to-apples comparison of its own 3G coverage with AT&T's confirms what the marketplace has been saying for months: AT&T failed to invest adequately in the necessary infrastructure to expand its 3G coverage to support its growth in smartphone business, and the usefulness of its service to smartphone users has suffered accordingly." The filing adds, "AT&T may not like the message that the ads send, but this Court should reject its efforts to silence the messenger," and concludes by saying, "This motion is a blatant effort to ask the Court to do what the marketplace will not do: shield AT&T from truthful comparative advertisements that Verizon has a right to air and that consumers have a right to see." Touché. Well-played, Verizon. Recent reports from Brandindex.com show that Verizon's brand image is skyrocketing, while AT&T's has plummeted.

AT&T could construe the reports to support its claim that the ads are misleading and damaging to its image. But if the shift in brand perception is a result of the ads, it could be because they're true rather than because they're misleading. Or, perhaps AT&T's claim has become a sort of self-fulfilling prophecy. By suing Verizon and making a big deal over the ad campaign, AT&T has increased the exposure of the ads exponentially. The fact that AT&T is whining about it publicly certainly doesn't improve its brand perception either. AT&T was hoping to get an emergency injunction to force Verizon to pull the entire marketing campaign pending the results of a full hearing.

It appears that AT&T will not get that injunction and that this case will not be heard for a while. Even if AT&T ultimately prevails, the holidays will be over, the damage will be done, and Verizon will most likely have ended the campaign of its own volition anyway. Verizon isn't making any claim in the ads that AT&T's own customers haven't already stated. Consumers have taken issue with AT&T's high dropped-call rate in some areas, sparse 3G coverage, and more. I doubt AT&T will be taking its customers to court to get them to stop complaining.

AT&T is its own worst enemy in this case. It is drawing attention to disparaging details about its own network, which it admits are accurate, and coming off looking like a whiner at the same time. It seems like a lose-lose and makes me wonder if the AT&T legal team isn't secretly working for the Verizon marketing team.

IT professionals looking to boost their high-tech careers in the coming five years are betting on security certifications and skills to help them stand out to potential employers, according to a new survey. CompTIA, an IT industry trade association, polled some 1,537 high-tech workers and found 37% intend to pursue a security certification over the next five years. Separately, nearly 20% indicated they would seek ethical hacking certification over the same time period.

And another 13% pinpointed forensics as the next certification goal in their career development. "When you add the results, you will see that about two-thirds of IT workers intend to add some type of security certification to their portfolio," says Terry Erdle, senior vice president of skills certifications at CompTIA. "This trend is driven by two factors: one, security issues are pervasive, and two, more and more people are moving to managed services and software-as-a-service models, which involves more complex networking. That level of non-enterprise data center computing has people look more closely at their security infrastructure." High-tech workers surveyed cited economic advancement and personal growth as the motivation to seek further certifications: nearly 90% said they want to spruce up their resumes, and another 88% said they hope to grow personally with new certifications. Emerging technologies and vertical industry trends also drive certification seekers. For instance, SaaS ranked among the technologies in which IT workers intend to seek certifications in the coming years, and green IT, mobile and healthcare IT also placed among high-tech career development plans. "We are going to see upwards of 70,000 IT jobs in healthcare and the related network and storage skills that come with electronic records, such as e-reporting and e-charting," Erdle adds. "We are working now to determine what kind of IT roles should be supported in certifications from CompTIA."


Cisco this week enhanced its IPv6 offerings for its carrier core and edge routers in an effort to ease the eventual migration from IPv4. The Carrier-Grade IP Version 6 Solution line includes a new hardware module for Cisco's CRS-1 router, and software for that system as well as for the ASR 9000 edge router. Cisco also unveiled professional services offerings to assist customers in the transition from IPv4 to IPv6. Cisco says there are 700 days left until the last block of IPv4 addresses is allocated, and that by 2015 there will be 15 billion IP endpoints on the Internet. The IPv6 enhancements rolled out this week are intended to provide a bridge from IPv4 to a full IPv6 network while preserving existing IPv4 addresses to ease the migration.

IPv6 has 340 undecillion unique addresses, or more than 50 billion billion billion for each person on earth, more than enough to continue to support the demand for IP addresses, Cisco says. IPv4 has a finite set of unique addresses, numbering approximately 4 billion, which is rapidly depleting due to the growth of Internet-connected devices and smart devices. However, the protocols of IPv4 and IPv6 are not directly compatible, so migrating a network from IPv4 to IPv6 requires preservation of IPv4 while orchestrating a gradual and prudent transition to IPv6. This has been a chief reason why the industry has been procrastinating on the migration even though IPv6 was developed a decade ago. But with IPv4 addresses facing imminent depletion, the time may have come to accelerate the adoption of IPv6. "I do think we've reached the point where we should be concerned about it," says Glen Hunt, an analyst at Current Analysis. "The biggest problem might be that we've been crying wolf about IPv6 and defining ways to get around attacking the problem. It's probably something to take seriously two to five years from now but (carriers) have to start to prepare." With that, Cisco unveiled the Carrier-Grade Services Engine for the CRS-1. Deployed deep in the core of a service provider's network, this module supports large-scale, high-throughput network-address translation (NAT). At the edge, Cisco rolled out the Carrier-Grade IPv6 Solution for its ASR series routers; this is software that helps enable NAT at the edge of a network for smaller or distributed IP networks.
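To put those numbers in perspective, the gap can be checked with a quick back-of-the-envelope calculation. The snippet below is purely illustrative, and the world-population figure is an assumed round number rather than one cited in the article.

```python
# Illustrative arithmetic for the IPv4 vs. IPv6 address-space figures cited above.
ipv4_total = 2 ** 32      # IPv4: roughly 4.3 billion unique addresses
ipv6_total = 2 ** 128     # IPv6: about 3.4e38 addresses, i.e. ~340 undecillion

world_population = 6.8e9  # assumed ~2009 world population (not from the article)

print(f"IPv4 addresses: {ipv4_total:,}")
print(f"IPv6 addresses: {ipv6_total:.2e}")
print(f"IPv6 addresses per person: {ipv6_total / world_population:.1e}")
# Prints roughly 5.0e+28 per person, in line with the "more than 50 billion
# billion billion for each person on earth" figure attributed to Cisco.
```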

The software is intended to first tunnel IPv6 addresses through IPv4, and then perform the inverse function as IPv6 addresses outnumber IPv4. Lastly, Cisco is offering services for the Carrier-Grade IPv6 Solution implementation. These are professional services designed to make the transition to IPv6 smooth and reduce the risk to network operations, ranging from initial planning and IPv6 readiness assessment to design and implementation. All products will be available in early 2010. Cisco did not disclose pricing.
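The article does not say which transition mechanism Cisco's tunneling software uses, so as a generic illustration of how IPv6 can be carried over IPv4 addressing, the sketch below shows how one long-standing mechanism, 6to4 (RFC 3056), derives an IPv6 prefix from an IPv4 address; it is a minimal example under that assumption, not a description of the Cisco products.

```python
import ipaddress

def sixto4_prefix(ipv4: str) -> ipaddress.IPv6Network:
    """Map an IPv4 address to its 6to4 IPv6 prefix (2002:VVWW:XXYY::/48).

    Generic example of one IPv6-over-IPv4 transition mechanism; not a
    description of Cisco's Carrier-Grade IPv6 Solution.
    """
    v4 = int(ipaddress.IPv4Address(ipv4))
    base = int(ipaddress.IPv6Address("2002::"))
    # The 32-bit IPv4 address occupies bits 16-47 of the 128-bit IPv6 address.
    return ipaddress.IPv6Network((base | (v4 << 80), 48))

print(sixto4_prefix("192.0.2.1"))  # -> 2002:c000:201::/48
```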

A look back at the week's biggest Google-related news stories:   Google, Verizon unite on Android devices  Verizon and Google have entered into an agreement to jointly develop wireless devices based on Google's open source Android mobile platform. During a teleconference Tuesday, Verizon CEO Lowell McAdam and Google CEO Eric Schmidt outlined the companies' new strategic partnership, which will see them working together to develop Android-based smartphones, PDAs and netbooks, and to deliver applications to users through the Android Market app store. Verizon says that it will have two Android-based handsets on the market by year-end, with more to come by 2010.   Google celebrates anniversary of bar code patent  Google's "doodle" on its search home page Wednesday was a bar code that presumably translated into the word "Google". It also said happy 57th anniversary to the awarding of a patent for the bar code to Joseph Woodland and Bernard Silver. The doodle coincided with the announcement earlier this week of the Nobel Prize for Physics to Charles Kao for his work on fiber-optic communications and to Willard Boyle and George Smith, who invented imaging technology using a digital sensor dubbed a CCD (Charge-Coupled Device). The CCD has enabled developments such as bar codes and bar code readers.   Gmail, other webmail passwords stolen  In the wake of the posting in online forums of stolen account and password information for thousands of Hotmail, Gmail and Yahoo e-mail accounts, evidence emerged over the last few days of yet more abuse, with attackers exploiting that information to hack into compromised accounts and send spam aimed at stealing credit cards.

Attackers have been taking advantage of the exposed account information for Hotmail, Gmail and Yahoo to break into victims' e-mail accounts and send out deceptive messages to the victims' contacts to promote the scam. According to Patrik Runald, senior manager of security research at Websense, the security firm noticed about a 40% surge in spam related to Yahoo, Gmail and Hotmail accounts in recent days, with some of the spam being a phishing scam related to a fake Chinese electronics shopping site.   Google Voice in the middle of things  AT&T buoyed the spirits of Google Voice fans this week by saying it would allow the application to run on its network, but later in the week word emerged that lawmakers want the FCC to look into whether Google Voice blocks calls to people in rural areas because they are expensive to connect. And guess which big carrier is encouraging the lawmakers in this pursuit?   Google Squared freshens up  PC World's David Coursey writes that "Google Squared, the ambitious project that delivers search results as a table, has received an update that improves both the quality and quantity of the information it presents." He cites a post on Google's blog saying the update will allow up to four times as many facts to be squeezed into a square.   Google, Microsoft woo Twitter  Various reports (All Things Digital, Reuters, etc.) had Google and Microsoft chatting with Twitter separately about how best to integrate Twitter with outside search engines.   Google: DRAM, DRAM, DRAM!  Computerworld reports that Google and the University of Toronto released a study of tens of thousands of Google servers showing that "data error rates on DRAM memory modules are vastly higher than previously thought and may be more responsible for system shutdowns and service interruptions." For more on Google, visit Network World's independent Google community, Google Subnet.

Sophos has added data-loss prevention capabilities to its desktop anti-malware software, the security firm says, promising gateway-based DLP in the future as well. Rainer Gawlick, chief marketing officer for Sophos, says the desktop DLP technology is designed to monitor for sensitive content a user might transmit via e-mail, Web uploads, USB sticks or DVDs, and block it if need be. Endpoint Security and Data Protection 9 combines a fully integrated desktop agent for DLP and malware protection in a single product, and for existing Sophos endpoint-security customers the DLP technology is a free upgrade, Gawlick notes. Many DLP systems today cost hundreds of thousands of dollars; however, Sophos doesn't claim the DLP functionality it packed into Endpoint Security and Data Protection 9 is the same found in high-end endpoint/gateway DLP products. For instance, the Sophos product doesn't have a DLP discovery tool, and it basically works by focusing on personally identifiable data such as credit-card and driver's license numbers.

But the Sophos product does allow for the creation of customized rules to identify and catch file data through discrete triggers, such as project codes, that could be added to documents as identifiers. In this approach, DLP is essentially a scanning problem. "There's pressure increasing on customers every day to protect data," Gawlick says. "What we're doing is making DLP implementable. We're using the existing agent to scan for malware and scan for confidential information." Some organizations may find they do need more sophisticated DLP, Gawlick says, but others may find what Sophos has come up with to be "practical DLP" that makes it hard for people to violate content-protection policies. John Raymond, IT security manager at Sacramento-based SAFE Credit Union, has long used Sophos anti-malware software in his organization, and for the past two months has been beta-testing the DLP features in Endpoint Security and Data Protection 9. Raymond had been using another standalone DLP product he declined to name, noting there had been minor interference at times between the separate agents. The single-agent DLP/anti-malware software from Sophos appears to have a smaller software footprint and to use less CPU than running separate DLP and anti-malware agents, he says.
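To make that scanning idea more concrete, here is a minimal, generic sketch of the kind of content check an endpoint DLP agent might perform. It is not Sophos's engine; the card-number pattern and the "PROJECT-ORION" trigger are made-up examples of the personally identifiable data and custom project-code rules described above.

```python
import re

# Generic endpoint DLP content-scan sketch (not Sophos's implementation).
# It flags credit-card-like numbers (validated with the Luhn checksum) and
# custom trigger strings, such as project codes added to sensitive documents.

CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")
CUSTOM_TRIGGERS = ["PROJECT-ORION"]  # hypothetical project code, not from the article

def luhn_valid(digits: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:   # double every second digit from the right
            d = d * 2 - 9 if d > 4 else d * 2
        total += d
    return total % 10 == 0

def scan(text: str) -> list:
    """Return findings that a DLP policy could then block, allow or log."""
    findings = []
    for match in CARD_PATTERN.finditer(text):
        digits = re.sub(r"\D", "", match.group())
        if luhn_valid(digits):
            findings.append(f"possible card number: {match.group().strip()}")
    for trigger in CUSTOM_TRIGGERS:
        if trigger in text:
            findings.append(f"custom trigger found: {trigger}")
    return findings

print(scan("card 4111 1111 1111 1111 relates to PROJECT-ORION"))
```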

While the other DLP agent made use of a type of fingerprinting technology that might be more accurate than the Sophos DLP, so far it appears that the Sophos technology is going to be sufficient to protect sensitive customer data. "I think it's going to be accurate enough for what we need," Raymond says, adding that the firm also has other gateway means to monitor for unauthorized transmissions. Administrators can be kept apprised of DLP actions via the same management console used to watch for malware events. According to Gawlick, Sophos DLP on the desktop can be used to block, allow or notify a user of violations of policy. "You can use it to simply warn the user," Gawlick says, by telling them that if the file is sent, the transmission will be logged in the management console. Although Sophos DLP doesn't support FTP now, subsequent editions will, and Sophos also plans a gateway-based product by the end of the year that would work with the same set of DLP rules. Sophos Endpoint Security and Data Protection 9, available now, starts at $34.25 per user for 500 users.

Intel Corp. is setting its sights on many-core chips that are far more powerful than today's dual- and quad-core processors. As expected, Intel took a big step in that direction today by unveiling a 48-core research chip that it says is 10 to 20 times more powerful than the current top-end offering in its multi-core Core line of processors. With its eye on the data center and the cloud, Intel built fully functional cores in the new chip as part of what it calls its "terascale" mission. Intel also noted that the experimental chip uses the same amount of energy as two household light bulbs. "With a chip like this, you could imagine a cloud data center of the future which will be an order of magnitude more energy efficient than what exists today, saving significant resources on space and power costs," said Justin Rattner, Intel CTO and head of Intel Labs. "Over time, I expect these advanced concepts to find their way into mainstream devices, just as advanced automotive technology such as electronic engine control, air bags and anti-lock braking eventually found their way into all cars." Today's unveiling of the 48-core research chip comes about two years after Intel showed off an experimental 80-core chip.

The 80 cores were not fully functional, however, and the chip was used mainly to study ways to make a large number of cores communicate efficiently with each other, as well as to help Intel engineers find new architectural and core designs. That research chip had teraflop performance capabilities but used less energy than a quad-core processor. At the time, Intel officials said that the company was five to eight years away from building a fully functional, commercial-ready 80-core chip. In an interview with Computerworld last month, Rattner said that schedule has changed and that engineers are even closer to developing such a chip. Intel reported today that it is bringing academics and experts from other high-tech firms into the loop by distributing 100 of the experimental 48-core chips so researchers can work on programming models and on developing software that can run on such a high number of cores. The chip maker, which is slated to unveil six- and eight-core Nehalem chips next year, also noted that it expects to integrate key features of the research chip into a new line of commercially available Core processors by early 2010. "This is an indication that Intel can deliver on its multi-core strategy," said Rob Enderle, an analyst with the Enderle Group. "It's very important in that it helps validate what Intel contends can be done and it adds credibility to their roadmap. The 80-core chip was more for bragging rights and was more of a science experiment. This one is more of a prototype, less flashy but more functional. It is all part of the process of bringing something new to market."

Intel reported that the 48-core chip is designed with a high-speed, on-chip network for sharing information, along with newly invented power management techniques that allow it to operate at as little as 25 watts, or at 125 watts when running at maximum performance. The company dubbed the experimental chip a "single-chip cloud computer" because its architecture resembles that of a cloud computing data center. Dan Olds, an analyst at The Gabriel Consulting Group, said the new research chip is an important step in the process of building many-core processors, but that Intel should release more details about the technology.

"This is a fairly important step in the evolution of computer processors," said Olds. "Multi-core chips have become the standard, with dual-core and quad-core chips used in almost every system. Core counts will definitely increase over time, but it's happened in small steps: two to four cores, four to six cores, and with eight and 10 cores coming in the next few years. This 48-core chip from Intel is important from a proof-of-concept perspective." But Olds said that Intel should explain what 48 "fully functional" cores really means. "We need more information from Intel in order to understand just how big a leap forward this chip really is," he added. "For example, can it handle the standard x64 instruction set? This is important in that it determines if existing software will be able to run on it without being ported. At a volume of only 100 units, this is more of a science project than an actual prototype. But we don't want to get ahead of ourselves. It's an important science project, one that might be giving important industry stakeholders a usable piece of the future, but still a science project." Enderle added that such research is critical for the whole cloud computing movement. "This technology is a requirement if the concept of cloud computing is ever to reach its potential for cost savings and energy efficiency," he said. "Without it, much of what we imagine with the massively flexible and utility-like concept of cloud computing simply won't be possible.

This is a very important milestone for the future."

As NASA's space shuttle Atlantis astronauts blasted into space this afternoon, the last thing on their minds was wondering what's for dinner. Rather, the Atlantis astronauts' main focus is delivering over 27,000 lbs. of spare parts to the International Space Station. But once in orbit, it wouldn't be that unusual to find spare ribs for supper.

According to NASA, shuttle astronauts "have an astonishing array of food items to choose from." NASA says each astronaut is allowed a "bonus food allotment" to bring some of the comforts of home to outer space. Last year, for example, the International Space Station received its first crab meat delivery courtesy of NASA: Miller's Select crab meat flew onboard NASA's shuttle mission STS-119. But food can represent a peculiar threat to space life. You may recall that in 2007, an astronaut was trying to make a pretend sushi meal with bag-packaged salmon and accidentally squirted a blob of spicy wasabi into the air. After a lengthy cleanup, the wasabi was exiled to a cargo bay. On the Space Shuttle, condiments such as ketchup, mustard and mayonnaise are provided. Salt and pepper are available, but only in liquid form, because astronauts can't sprinkle salt and pepper on their food in space; it would simply float away.

There is a danger ketchup, salt and pepper or other favorites could clog air vents, contaminate equipment or get stuck in an astronaut's eyes, mouth or nose, NASA's Space Food Website says. One article notes that "NASA's food laboratory carefully balances diets between six categories: beverage, rehydratable, intermediate moisture, thermostabilized, irradiated, and natural form." Despite the "threat" food can pose to the astronaut, spicy foods are popular in space because most of the food is dried in one form or another and zero gravity does nothing good for sinuses or flavor. Astronaut Don Pettit brought along small cans of green chilies on one Space Shuttle trip. Astronaut Sid Gutierrez once said space shuttle crews always take spicy accouterments like taco sauce to make food taste better. On a previous mission, taco sauce had become carefully guarded currency. The taco sauce, he said, also could be used for barter. "If it was your turn to, say, clean the latrine, you could trade for two packets of taco sauce," he said.

Some packaging actually prevents food from flying away, always a major concern. This is the reason tortillas are taken along on the flight rather than bread: tortillas make far fewer crumbs than bread, and crumbs are bad because they can potentially float around and get stuck in filters or an astronaut's eye, NASA says. Space food, however, isn't reserved for astronauts, thanks to a cookbook out this week that will tell you everything you wanted to know about cooking and eating in space. The Astronaut's Cookbook: Tales, Recipes, and More, penned by NASA veterans Charles Bourland and Gregory Vogt, offers up a number of recipes as well as a history of space feasting, just in time for Thanksgiving, if you are so inclined. The book includes astronauts' home favorite recipes and NASA quarantine food recipes, along with a number of interesting space food facts: Soviet cosmonaut Gherman Titov was the first human to consume food in space; John Glenn, Jr., was the first American to consume food in space; and one astronaut wanted Georgia BBQ in space.

Celebrity chefs Rachael Ray and Emeril Lagasse contributed recipes to the NASA space program, and their recipes are featured in the book. Recipes in the book are extracted from the NASA food specifications and modified for regular, earth-bound kitchens. Bourland spent 30 years at the NASA Johnson Space Center developing food and food packages for spaceflight. Vogt is a veteran writer, science consultant, and developer of science and technology materials for schools.