Adobe Gets Its Say on Microsoft's MAPP Security Channel

Adobe (Nasdaq: ADBE) has joined Microsoft's (Nasdaq: MSFT) MAPP program, which provides members with information about security vulnerabilities before Microsoft releases its monthly patches.
This will let Adobe, which has been plagued by security flaws, notify MAPP members about vulnerabilities in its apps so they can protect users against those flaws more quickly.
Separately, Microsoft has announced a coordinated vulnerability disclosure program that lets anyone who discovers a security flaw report it directly to the affected vendor, to CERT/CC, or to another coordinator, so vendors of the affected products get the information in time to fix the problem.
Microsoft also released several resources to help customers make informed decisions about security and manage their risk.

Gimme Shelter

Microsoft announced the tie-in with Adobe on Wednesday at the Black Hat USA 2010 conference.
Joining the Microsoft Active Protections Program (MAPP) lets Adobe piggyback on the bulletins about newly discovered vulnerabilities that Microsoft sends to the program's 65 global members, all of them security vendors. These bulletins go out far enough ahead of Microsoft's regular monthly patches for MAPP members to build protections against those vulnerabilities.
"By sharing Adobe vulnerability information with MAPP members prior to the public release of a security update, we give security providers an early start over exploit code writers, enabling them to offer protection to our mutual customers in a timely manner," Adobe's Wiebke Lips wrote about the tie-in.
"Adobe is the first company to publish security information on their own products through what, until now, has been an exclusively Microsoft program," Andrew Storms, director of operations at nCircle, told TechNewsWorld.
MAPP members include Cisco (Nasdaq: CSCO), Symantec (Nasdaq: SYMC) and McAfee, Dave Forstrom, director of Microsoft's Trustworthy Computing group, told TechNewsWorld. However, Adobe won't exactly be a member of the program.
"Adobe is not part of that group of 65, as it's partnering with Microsoft to share early warning details of vulnerabilities with them," Forstrom pointed out.

The Road to Rehabilitation

Adobe has been plagued by security vulnerabilities, and its Flash Player is among the favorite vectors of attack used by hackers and malware developers because it's so widespread. Hackers also like attacking through PDF files for the same reason.

Flash has been exploited enough by cybercriminals that Apple (Nasdaq: AAPL) CEO Steve Jobs publicly mentioned its security vulnerabilities in an open letter earlier this year.
"I think only Adobe has been as popular as Microsoft with cybercriminals," Roel Schouwenberg, a senior antivirus researcher with Kaspersky Lab Americas, said.
Teaming up with Microsoft to provide advance warning of security flaws may help Adobe restore its reputation.
"This is a smart move on Adobe's part, and it may eventually help them rehabilitate their tattered security reputation," nCircle's Storms pointed out.
"The advantage for Adobe is that this move will make it much easier for security companies to create reliable detection and mitigation strategies for flaws in its products," Schouwenberg told TechNewsWorld.
The team-up with MAPP is Adobe's second major security move this month. A week ago, Adobe introduced Adobe Reader Protected Mode. This is based on Microsoft's Practical Windows Sandboxing technique, and prevents hackers from accessing a user's computer through PDF files.

One Big Happy Anti-Cybercriminal Family

At the Black Hat conference, Microsoft also pushed its coordinated vulnerability disclosure approach to fighting cybercrime.
This calls for anyone who discovers new vulnerabilities to disclose the information directly to the vendors of the affected products or to a CERT coordination center or other coordinator.
The CERT coordination center, or CERT/CC, identifies and addresses existing and potential security threats; notifies system administrators and other technical personnel of those threats; and coordinates with vendors and incident response teams worldwide to address those threats.
This early disclosure will give the affected vendor enough time to diagnose and offer fully tested updates, workarounds or other corrective measures before detailed vulnerability or exploit information is made public, Forstrom said.

Who's to Blame?

Perhaps the move is a response to Google (Nasdaq: GOOG) researcher Tavis Ormandy's public disclosure in June of a security flaw in Microsoft's Help and Support Center in Windows XP and Windows Server 2003. Microsoft suggested workarounds that drew criticism from the security community because they led to other problems.
"Microsoft is attempting to get a broader consensus in responsible disclosure by eliminating the hot button of calling irresponsible people like Tavis Ormandy irresponsible," opined Randy Abrams, director of technical education at ESET. "Ormandy's recent irresponsible disclosure put millions at risk for the sole purpose of inflating his ego while helping a few good guys, a ton of bad guys, and putting many in harm's way."
However, Redmond has to share part of the blame for people publicly disclosing information about vulnerabilities and exploits before the affected vendors have addressed these, Abrams told TechNewsWorld.
"The responsible disclosure process is, to a large degree, a problem Microsoft participated in creating with years of irresponsible reactions to responsible disclosure," Abrams pointed out. "Microsoft has dramatically improved the appropriateness of its responses to vulnerabilities, but it takes a lot longer to rebuild than to tear down." 

Dell upgrades the Studio XPS Desktop line


New Dell Studio XPS desktop delivers latest technology at sub-$1,000 mainstream price
Features new Intel Core i7 processor with the power to take on the most demanding digital duties today – and tomorrow
New Studio XPS brand spotlights premium products, featuring state-of-the-art technology, stylish design
Dell has shifted its season of design and discovery into high gear with the launch of the Studio XPS desktop, a premium PC that features the new ultra-fast Intel® Core™ i7 processor. This is the first PC in the Studio XPS line, which represents the best of the Studio product portfolio: premium products that combine state-of-the-art technology, great performance and stylish design.
Available today on www.dell.com/Studioxps, the Studio XPS desktop starts at $949 (without monitor) and can be customized with a variety of complementary high-performance components. Technology enthusiasts will appreciate that the Studio XPS desktop with a Core i7 processor can deliver up to 44 percent faster video editing and encoding, plus outstanding performance on applications like image rendering, photo retouching and editing.
The Studio XPS desktop packs a lot of the same performance and features found in full-size high-performance gaming rigs, like the Alienware Area-51 X-58 and the XPS 730x, into a smaller, more compact mini-tower with a glossy black finish that looks sharp in the home office or the entertainment center. All systems share the new Hyper-Threading and Turbo Boost technology of Intel’s new Core i7 processor, which provides dynamic speed increases on demand and delivers up to 65 percent better performance in the latest multithreaded applications.
Holiday shoppers can purchase a Dell Studio XPS desktop with Intel’s new Core i7 processor starting at $949. For the next 48 hours, they can get a special introductory bundle for $999 that includes 4GB of memory and a 20-inch flat panel monitor. See www.dell.com/xpsevent for more information.
The Studio XPS desktop configured with an optional TV tuner is an ideal choice as the center of a home entertainment system:
• All graphics options include a built-in HDMI port for easy connection to most LCD TVs.
• Support for up to 1TB of hard drive space to store music libraries, favorite movies and TV shows.
• An optional Blu-ray Disc™ drive can alleviate the need to purchase, make room for and set up a separate Blu-ray player.
• Compact size fits easily in smaller spaces: 14.2 x 6.7 x 17.1 inches.
The Studio XPS desktop features 64-bit versions of Microsoft® Windows Vista®, so it can take advantage of up to 12GB of tri-channel DDR3 memory, which can improve performance in multitasking and memory-intensive applications like photo and video editing software.
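The arithmetic behind that 64-bit requirement is simple (a generic illustration, not from Dell's materials): a 32-bit operating system can address at most 2^32 bytes of memory, so anything beyond 4GB calls for a 64-bit OS.

```python
# Why 12GB of RAM needs 64-bit Windows: address-space limits.
GB = 2 ** 30
print(f"32-bit addressable limit: {2 ** 32 / GB:.0f} GB")    # 4 GB
print(f"64-bit addressable limit: {2 ** 64 / GB:.2e} GB")    # ~1.7e10 GB
```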
A plethora of ports on the Studio XPS desktop makes it easy to connect accessories and peripherals: eight USB 2.0 (four front, four back), two IEEE 1394 (4-pin front, 6-pin back), microphone/line in (front and back), built-in 7.1 audio support (six connectors), S/PDIF and eSATA (all back).

Robot Touchscreen Analysis

Robot Touchscreen Analysis from MOTO Development Group on Vimeo.

A touchscreen is a touchscreen, right? Hardly! As we pointed out in MOTO’s recent Do-It-Yourself Touchscreen Analysis post, “All touchscreens are not created equal.”

With our simple test technique — which basically consists of using a basic drawing application and a finger to slowly trace straight lines on the screen of each device — it’s easy to see the difference in touchscreen resolution from one phone to the next. Results with straight lines indicate a high degree of sensor accuracy; less-precise sensors show the lines with wavy patterns, stair-steps, or both.
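For readers who want to put a number on that difference rather than eyeball it, here's one way the idea could be expressed in code. This is a hypothetical sketch, not the analysis tooling we actually used: it fits a straight line to a series of captured touch points and reports how far, on average, the trace strays from it.

```python
# Hypothetical illustration (not MOTO's actual tooling): quantify the
# "waviness" of a traced line by fitting a least-squares line to the
# captured touch points and measuring the RMS deviation from it.
import numpy as np

def waviness(points):
    """RMS deviation, in pixels, of touch samples from their best-fit line.

    Assumes a roughly horizontal trace, so y can be fit as a function of x.
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    slope, intercept = np.polyfit(x, y, 1)      # least-squares fit
    residuals = y - (slope * x + intercept)     # per-sample deviation
    return float(np.sqrt(np.mean(residuals ** 2)))

# A precise sensor yields a score near zero; wavy or stair-stepped
# traces from a less accurate sensor score noticeably higher.
trace = [(10, 100), (20, 102), (30, 99), (40, 104), (50, 98)]
print(f"waviness: {waviness(trace):.2f} px")
```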

After we published our first comparison of four touchscreen smartphones, a few critics found fault with our DIY testing technique. Many of these comments centered on the idea that our human-finger methodology is prone to inconsistency, due to variables in finger pressure, line straightness, or tracing speed.

Human Error?

Our response to these arguments is pretty simple: These are all fair points. Nevertheless, we’re confident that such inconsistencies do not distort the basic results of our touchscreen shootout. In other words, the inconsistencies are real, but they don’t make much difference.

Nevertheless, to satisfy the critics, we decided to give them exactly what they asked for: We wrote a script for MOTO’s laboratory robot and then re-ran the comparison to see how the touchscreens stack up when the lines are drawn by our robot’s slow and precise “finger.” (See the robot in action in the video below.)

Add Some New Contenders

Before running the robot test, we also decided to satisfy the many requests we received to add the Palm Pre and the Blackberry Storm 2 to the mix. How did the new phones perform? The Blackberry and the Palm touchscreens both performed fairly well. The iPhone still retains its crown as King of the smartphone touchscreens, with the Nexus One in a distant second. Take a look:

Understanding the Results

Touchscreen performance variation occurs because there is no out-of-the-box solution for manufacturers that hope to install multi-touch screens in consumer electronic devices.

To get it right, gadget-makers have to assemble a variety of critical elements — screen hardware, software algorithms, sensor tuning, and user-interface design, to name but a few — and then refine each component of the stack to deliver the best touchscreen experience possible. It’s a complex and laborious process that requires extremely close collaboration between multidisciplinary teams, as well as a high-level vision for a quality end-user experience.

Indeed, from a consumer perspective, what matters most isn’t the performance of the touchscreen itself, but how well a touchscreen performs in combination with its operating system and user-interface to deliver an experience that is satisfying overall.

Still, it’s useful to look at touchscreen performance in isolation, because it is a central ingredient in the mix and a good indicator of how satisfying a touchscreen experience is likely to be.

Watch the video for the full story.

Does the Drawing App Make a Difference?

Some readers who saw our last DIY Touchscreen Analysis post wondered which drawing applications we used, and whether the drawing application could influence the results by either compensating for or distorting hardware performance.

Developers who create drawing apps sometimes add smoothing algorithms to make the input look more natural. However, the artifacts of these algorithms are fairly easy to identify with casual exploration. We chose drawing applications that we found to do minimal (if any) smoothing of the input data.

In any case, smoothing really only comes into play when you are moving quickly – at the snail-like pace of the test robot, you can see that the data, as captured, appears immediately on the screen and never changes to a “smoothed” version.
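As a concrete, purely illustrative example of what such smoothing might look like, here is a toy exponential moving average over incoming touch samples. Real drawing apps may use splines or velocity-dependent filters instead; the point is simply that each displayed point becomes a blend of the raw input and its predecessors.

```python
# Toy example of input smoothing (our illustration, not any specific
# app's algorithm): exponentially average each incoming touch sample
# with the previously displayed point.
def smooth(points, alpha=0.5):
    """Blend each (x, y) sample with the prior smoothed point.

    alpha=1.0 means no smoothing; smaller values smooth more heavily.
    """
    smoothed = [points[0]]
    for x, y in points[1:]:
        px, py = smoothed[-1]
        smoothed.append((alpha * x + (1 - alpha) * px,
                         alpha * y + (1 - alpha) * py))
    return smoothed

raw = [(0, 0), (10, 4), (20, -3), (30, 5), (40, -2)]   # a jittery trace
print(smooth(raw))
```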

Of course you don’t have to take our word for it – try it yourself! Here are the apps we used:

  • Blackberry Storm: Canvas
  • iPhone: SimpleDraw
  • Droid Eris, Droid: DrawNoteK
  • Palm Pre: Super Paint
  • Google Nexus One: SimplyDraw

Human v. Robot

Finally, as predicted, the lineup below shows how our simple finger test correlates quite closely with the more formal results we got when we used our ultra-precise, ultra-consistent robot in MOTO’s laboratories:

Indeed, notice that by and large, the results look even worse in the robot tests. That’s because the robot drew lines at only a quarter-inch per second — much more slowly than our DIY test.

And as we’ve explained previously, low speed is crucial to testing the true performance of the screen: tracing at high speed skips over the many data points that get captured at slow speed, causing lines to look straighter than they actually are. The robotic finger is also somewhat less compliant than a human finger, making it a little harder for the sensor to detect, which confuses poor screens even more than the human version of the test does.

A Prediction

In the long run, however, we don’t expect this dramatic degree of touchscreen variation between handset manufacturers to continue.

Right now, capacitive touchscreens are a relatively new feature to appear in consumer electronics products. And as we’ve pointed out several times before, creating a seamless touchscreen experience is hard work that requires a high level of commitment to technology integration and interdisciplinary teamwork. Over time more brand-name manufacturers will acquire the expertise required to deliver excellent touchscreen products.

We know for a fact that the solutions in these phones (other than the iPhone) are all last-generation silicon and touch panel components – the other touch screen makers are hard at work perfecting their new solutions, and they may just leapfrog Apple in some areas when they arrive on the market over the next year.

Just consider the “door slam test” that’s often used to evaluate the build-quality of automobiles. Like touchscreen devices, cars are complex machines that require a high level of system integration. A decade ago, the difference in quality between established manufacturers like BMW or Mercedes and a relative newcomer like Hyundai was dramatic. A door-slam on the former felt solid and precise; the latter felt loose and tinny. Yet today Hyundai has closed the gap, and many of the company’s cars pass the door-slam test in world-class style.

In other words, practice can help make perfect. It’ll be interesting to re-run our touchscreen test a year or two from now to see how the playing field starts to even up.

When Four Cores Aren't Enough: Intel's Core i7-980X Extreme Edition

Intel has announced its latest Extreme Edition processor, the Core i7-980X. Like the recently released 2010 Clarkdale lineup, the i7-980X (previously code-named Gulftown) brings Intel's Turbo Boost and Hyper-Threading technologies to the 32nm process. The i7-980X is also Intel's first processor with six physical cores, offering increased system performance in applications optimized to take advantage of them.

The Core i7-980X will essentially replace Intel's current performance king, the 45nm Core i7-975 Extreme Edition. While the Core i7-975 will still be available, the new six-core processor will be offered at the same $999 price point--that's six cores for the price of four! But how much of a difference can two extra cores make?

At a glance, the Core i7-975 and the Core i7-980X are identical. Both sport a base clock speed of 3.33GHz, report a TDP rated at 130W, and support three channels of DDR3-1066 memory. But the two additional cores mean that the processor has 12 threads for an application to work with, versus four cores and eight threads in the i7-975.

You'll also find a 12MB L3 cache shared across all six of those cores, as opposed to the 8MB cache in the i7-975. A processor's cache functions as a memory storage area, where frequently accessed data remains readily accessible. A larger L3 cache shared across all six cores allows data to be exchanged among them far more readily, improving performance in multithreaded applications. With a large cache and four extra virtual threads, you'd expect to find the greatest appreciable performance difference between the two chips in applications designed to take advantage of multiple cores--and our test results reflected as much.
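As a rough illustration of why those extra hardware threads matter (our sketch, not Intel's benchmark code), a CPU-bound job split across one worker per logical CPU scales roughly with the thread count the chip exposes: 12 on the i7-980X versus 8 on the i7-975.

```python
# Illustrative sketch: split a CPU-bound task across one worker per
# logical CPU. On an i7-980X, os.cpu_count() reports 12; on an
# i7-975 it reports 8.
import os
import time
from concurrent.futures import ProcessPoolExecutor

def burn(n):
    """A deliberately CPU-bound chunk of work."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    workers = os.cpu_count()
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        total = sum(pool.map(burn, [2_000_000] * workers))
    print(f"{workers} workers finished in {time.perf_counter() - start:.2f}s"
          f" (checksum {total})")
```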

For our tests, Intel provided a pair of DX58SO motherboards. Serial upgraders should be pleased to note that the Core i7-980X is compatible with existing X58 chipsets. Just drop it into your existing motherboard, and you're (almost) ready to go; we had to perform a required, but painless, BIOS update. Our second test bed was equipped with the aforementioned Core i7-975 Extreme Edition processor. Both test beds also carried 6GB of RAM, 1TB hard drives, ATI Radeon HD 5870 graphics cards, and optical drives for loading software. We ran all of our tests on Windows 7 Ultimate Edition (64-bit).

Intel is pitching the Core i7-980X as the premier part for the enthusiast gaming crowd. In our tests, we did see some improvements over the Core i7-975, but they were marginal. In Unreal Tournament 3 (1920-by-1200 resolution, high settings), the Core i7-980X cranked out 159.9 frames per second as compared to the Core i7-975's 155.4 fps, a 2.8 percent improvement. In Dirt 2, the Core i7-980X offered 73.3 fps, against the Core i7-975's 71.7 fps--a 2.2 percent increase.

Those results are hardly surprising. Despite the proliferation of multicore processors, many modern video games have yet to take full advantage of multithreading. Sega's recently released Napoleon: Total War and Ubisoft's upcoming R.U.S.E. have both touted their Core i7-980X optimization, claiming greater detail and realism thanks to simply having more physical cores to work with.

Other games boasting optimization for more than four processor threads include Ubisoft's Far Cry 2, Capcom's Resident Evil 5, and Activision's Ghostbusters. That being said, if you recently sprang for a Core i7-975 and are strictly a gamer, there's no need to curse your poor timing--at least, not until more developers fully commit to the multithreaded bandwagon.

If, on the other hand, you spend much of your time working with multithreaded applications--including Blender, Adobe Photoshop, and Sony Vegas Pro--coughing up $1000 for your workstation's processor might not necessarily be a bad idea.

The most tangible results will be apparent in applications designed to sprawl across as many cores as possible. Take Maxon's Cinema 4D, 3D animation software used by professionals in numerous industries. In Maxon's Cinebench CPU benchmark--which can utilize up to 64 processor threads--the six-core i7-980X saw a 40 percent improvement in performance over the quad-core i7-975.
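A back-of-the-envelope Amdahl's law estimate (our own, not Intel's or Maxon's) makes that 40 percent figure plausible. If a fraction p of the render parallelizes perfectly across physical cores, the expected six-core-over-four-core speedup is [(1-p) + p/4] / [(1-p) + p/6]; the observed 1.4x works out to a render loop that is roughly 96 percent parallel.

```python
# Back-of-the-envelope Amdahl's-law check (ignores hyperthreading and
# clock differences): what parallel fraction p explains a 1.4x gain
# when moving from 4 to 6 physical cores?
def speedup(p, old_cores=4, new_cores=6):
    return ((1 - p) + p / old_cores) / ((1 - p) + p / new_cores)

for p in (0.90, 0.95, 0.96, 1.00):
    print(f"p = {p:.2f}: predicted speedup = {speedup(p):.2f}x")
# p = 0.96 yields ~1.40x, matching the Cinebench result above.
```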

When considering a processor with a 130W TDP, there's a good chance that saving a few bucks on your energy bill isn't your chief concern. Nevertheless, the Core i7-980X does offer perceptible gains over the i7-975. With all power-saving features disabled, power utilization at peak levels for the i7-980X was 210 watts, versus the i7-975's 231 watts. That's roughly a 10 percent reduction despite the two extra cores, indicative of the potential power savings of the smaller 32nm process.

There's a lot to like here, but that's to be expected--this is a $1000 piece of silicon, after all. As far as gamers are concerned, the i7-980X may not blow the i7-975 out of the water currently, but in this case the performance bottleneck lies in the lack of available multithreaded offerings--a trend that's already begun to change. If this chip is in your price bracket, it's well worth the cost of entry provided that you haven't plunked down for an Extreme Edition processor too recently. And as multicore processors and multicore-optimized applications become increasingly common, you'll be able to put all six of those cores to good use--for work and play.

Google Goes Social with Google Buzz


It’s official: Google has just announced Google Buzz, its newest push into the social media fray. This confirms earlier reports of Gmail integrating a social status feature.

On stage revealing the new product was Bradley Horowitz, Google’s vice president for product management. While introducing the product, Mr. Horowitz focused on the human penchant for sharing experiences and the social media phenomenon of wanting to share it in real time. These two key themes were core philosophies behind Google Buzz.

“It’s becoming harder and harder to find signal in the noise,” Bradley stated before introducing the product manager for Google Buzz, Todd Jackson.

Here are the details:


Google Buzz: The Details



- Mr. Jackson introduced “a new way to communicate within Gmail.” It’s “an entire new world within Gmail.” Then he introduced the five key features that define Google Buzz:

- Key feature #1: Auto-following

- Key feature #2: Rich, fast sharing experience

- Key feature #3: Public and private sharing

- Key feature #4: Inbox integration

- Key feature #5: Just the good stuff


- Google then began the demo. Once you log into Gmail, you’ll be greeted with a splash page introducing Google Buzz.

- There is a tab right under the inbox, labeled “Buzz”

- It provides links to websites and content from around the web; Picasa, Twitter, Flickr and other sites are aggregated.

- It shows thumbnails when linked to photos from sites like Picasa and Flickr. Clicking on an image will blow it up to fill almost the entire browser, making it easier to see.

- It uses the same keyboard shortcuts as Gmail. This makes sense. Hitting “R” allows you to comment/reply to a buzz post, for example.

- There are public and private settings for different posts. You can post updates to specific contact groups. This is a lot like Facebook friend lists.

- Google wants to make sure you don’t miss comments, so it has a system to send you an e-mail letting you know about updates. However, the e-mail will actually show you the Buzz you’ve created and all of the comments and images associated with it.

- Comments update in real time.



- @replies are supported, just like Twitter. If you @reply someone, it will send the buzz to that individual’s inbox.

- Google Buzz has a “recommended” feature that will show buzzes from people you don’t follow if your friends are sharing or commenting on that person’s buzz. You can remove a recommendation or change this behavior in settings.

- Google is now talking about using algorithms to help filter conversations, as well as about Buzz on mobile devices.


The Mobile Aspect



- Google Buzz will be accessible via mobile in three ways: from Google Mobile’s website, from Buzz.Google.com (iPhone and Android), and from Google Mobile Maps.

- Buzz knows where you are. It will figure out what building you are in and ask you if it’s right.

- Buzz has voice recognition, posting what you say as a buzz in real time. It also geotags your buzz posts.

- Place pages integrate Buzz.


- In the mobile interface, you can click “nearby” and see what people are saying around you. Nifty, if I do say so myself.

- You can layer Google Maps with Buzz. You can also associate pictures with buzz within Google Maps.

- Conversation bubbles will appear on your Google Maps. They are geotagged buzz posts, which lets you see what people are saying nearby.

Intel and Micron first to 25nm with new flash memory chips


Intel and Micron plan to unveil new 25-nanometer flash memory chips on Monday via their IM Flash Technologies joint venture, the first commercial chip products made using advanced 25nm manufacturing technology.
The new 64 gigabit (8 gigabyte) MLC (multi-level cell) NAND flash memory chip will give the companies a significant cost advantage over rivals, chip market researcher Objective Analysis said in a research note. The research note was inadvertently sent out ahead of an official announcement by Intel and Micron, which is slated for Monday.
An Intel representative confirmed the new chips and said they are aimed at smartphones, solid-state drives (SSDs), and portable media players such as iPods.
"We are currently sampling it with production expected in the second quarter," Intel said via e-mail.
The use of tiny 25nm technology puts the companies ahead of rivals in the flash industry. Samsung Electronics, the world's largest producer of flash memory, is starting work on 30nm technology this year and plans to use it in most production lines by the end of 2010.
The nanometer measurement describes the microscopic size of transistors and other parts on a chip. A nanometer is a billionth of a meter, roughly the width of a few atoms.
Developing smaller chip manufacturing technology is crucial to meeting user demand for small devices that can perform many functions, such as smartphones with built-in music players, cameras and computers. Smaller etching technologies also enable companies to increase chip speed and reduce power consumption.
Advances in chip manufacturing technology also lower costs over time, a major benefit to consumers.
Objective Analysis estimates the manufacturing cost of the new 25nm flash chips will be about $0.50 per gigabyte (GB), compared to $1.75 per gigabyte for mainstream 45nm flash. The market price of flash chips has been hovering around $2.00 per gigabyte, Objective Analysis said, and will likely remain there throughout 2010.
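To put Objective Analysis's numbers in concrete terms (our own arithmetic, using the estimates above), consider the new 64Gb (8GB) chip at those per-gigabyte manufacturing costs and the roughly $2.00-per-gigabyte market price:

```python
# Rough margin arithmetic using the figures cited above.
CHIP_GB = 8                      # 64 gigabits = 8 gigabytes
MARKET_PRICE_PER_GB = 2.00       # approximate 2010 market price

for node, cost_per_gb in (("25nm", 0.50), ("45nm", 1.75)):
    cost = CHIP_GB * cost_per_gb
    sale = CHIP_GB * MARKET_PRICE_PER_GB
    print(f"{node}: build ${cost:5.2f}, sell ${sale:.2f}, "
          f"margin ${sale - cost:5.2f}")
```

At those estimates, the 25nm part costs about $4 to build against $14 for a 45nm equivalent, which is the "significant cost advantage" the researcher describes.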
Intel and Micron are currently offering chip samples to customers so they can start to plan them into gadget designs, according to the researcher.
The companies started using 34nm technology in their flash memory chip factories in May 2008. The march to 25nm took about a year and a half.
Samsung on Friday noted strong demand for embedded flash memory products used in smartphones and other devices during its fourth-quarter investors' conference. The company believes there will be limited flash memory supply increases because a number of memory chip makers were hurt by the recession and have not been able to build new factories or upgrade old chip lines to the latest technologies.

Available IPv4 addresses dwindle below 10 per cent

The long-awaited depletion of the Internet's primary address space came one step closer to reality on Tuesday with the announcement that fewer than 10 per cent of IPv4 addresses remain unallocated.

The Number Resource Organization (NRO), the official representative of the five Regional Internet Registries, made the announcement. The Regional Internet Registries allocate blocks of IP addresses to ISPs and other network operators.

The NRO is urging Internet stakeholders — including corporations, government agencies, ISPs, IT vendors and users — to take immediate action and begin deploying the next-generation Internet Protocol known as IPv6, which has vastly more address space than today's IPv4.

"This is a key milestone in the growth and development of the global Internet," said Axel Pawlik, Chairman of the NRO, in a statement. "With less than 10 per cent of the entire IPv4 address range still available for allocation to RIRs, it is vital that the Internet community take considered and determined action to ensure the global adoption of IPv6."

IPv4 is the Internet's main communications protocol. IPv4 uses 32-bit addresses and can support about 4.3 billion IP addresses.

Designed as an upgrade to IPv4, IPv6 uses a 128-bit addressing scheme and can support so many IP addresses that the number is too big for most non-techies to grasp. (IPv6 supports 2 to the 128th power of addresses, roughly 3.4 x 10^38.) IPv6 has been available since the mid-1990s, but deployment only began in earnest last year.
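The gap between the two address spaces is easy to compute, if hard to intuit (standard arithmetic, not part of the NRO announcement):

```python
# IPv4 vs. IPv6 address-space arithmetic.
ipv4 = 2 ** 32       # 32-bit addresses
ipv6 = 2 ** 128      # 128-bit addresses

print(f"IPv4: {ipv4:,} addresses")                     # ~4.3 billion
print(f"IPv6: {ipv6:.2e} addresses")                   # ~3.4 x 10^38
print(f"IPv6 holds {ipv6 // ipv4:,} IPv4-sized address spaces")
```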

The NRO recommends IPv6 as a way of ensuring that the Internet can support billions of additional people and devices, and it recommends the following actions:

* Businesses should provide IPv6-capable services and platforms.

* Software and hardware vendors should sell products that support IPv6.

* Government agencies should provide IPv6-enabled content and services, encourage IPv6 deployment in their countries, and purchase IPv6-compliant hardware and software.

* Users should request IPv6 services from their ISPs and IT vendors.

NRO officials warned of "grave consequences in the very near future" if the Internet community fails to recognize the rapid depletion of IPv4 addresses.