Laliwala IT Services
Website Development

Tuesday, May 21, 2013

Chinese hackers renew cyber-attacks on U.S. targets despite shame campaign


For the last three months or so, the U.S. government and some of its defense contractors have waged a campaign to shame China into cooling its cyber-attacks on U.S. targets. The campaign appeared to be yielding results, but it seems that Chinese hackers were only catching their breath.
The notorious Unit 61398, also known as the "Comment Crew," an elite cyber unit that U.S. security firms have linked to China's People's Liberation Army (PLA), has renewed its raids on U.S. entities using different techniques, the New York Times reported Sunday.
Cyber security firm Mandiant told the Times that the attacks had been renewed, but would not identify the targets—although it did acknowledge that many of them were the same ones assaulted earlier by the Chinese cyber unit.
Mandiant did not respond to a request for comment for this story.

Background

Mandiant released a report in February that kicked off the shame campaign against China. The report tied Unit 61398 to cyber-attacks on 141 companies, 87 percent of them headquartered in English-speaking countries and operating in 20 industries that China considers strategic.
Immediately following the report’s release, China repudiated the document, maintaining it was based on flawed evidence.
Nevertheless, the attacks began to abate after the report’s release, and the hackers removed their spy tools from the organizations they had infiltrated, according to Mandiant.
Over the past two months, however, Mandiant found an uptick in infiltration activity aimed at the same companies but originating from different servers.
Activity now is about 60 to 70 percent of what it was before the hiatus began in February, Mandiant estimated.

Not a good strategy?

The shame campaign was a dubious strategy, asserted Jeffrey Carr, CEO of Taia Global and author of Inside Cyber Warfare: Mapping the Cyber Underworld.
”It’s a terrible idea,” he told PCWorld.
Shame, as a diplomatic tool, doesn’t seem to work however it is used. “We’ve tried to use it to shame North Korea into behaving itself and obviously that hasn’t worked,” Richard Stiennon, chief research analyst at IT-Harvest, told PCWorld.
Carr said that the U.S. government needs to cooperate and collaborate with China to pursue criminal groups engaging in intellectual property theft.
”You’re not going to stop a government from engaging in espionage, so that should just be off the table,” he said.

What might work

Much of the data theft could be stopped if the U.S. collaborated with China against groups that engage in criminal espionage from within its borders, or that commandeer its computers from outside, Carr said.
"The New York Times and Mandiant have collaborated on this theory that Comment Crew is part of the PLA," Carr added. "Mandiant has never established that. It just made the claim that it is."
Another way to counter cyber threats from China is to make it more expensive for the hackers to get the information they want, added Stiennon.
"Right now it's very inexpensive to engage in these cyber-attacks," he said.
"Mandiant's report slowed them down, forced them to retrench, pull their tools out, and reengage," he continued. "They spent a lot of man-hours because of that report."
"That reaction was expensive for the attackers," he added.

Supercapacitor that could charge your phone in less than 30 seconds wins teen $50,000


  An 18-year-old recently won $50,000 in scholarship funds for inventing a supercapacitor that could one day be used to fully charge a mobile device like a smartphone in just a few seconds. Eesha Khare and two other teens were among the top winners at the annual Intel International Science and Engineering Fair.
Her design, a tiny device that fits inside cell phone batteries, would allow them to fully charge within 20 to 30 seconds. The supercapacitor can last for up to 10,000 charge cycles, which outpaces traditional batteries by a factor of 10. Intel said the invention also has potential applications for car batteries, but it's the mobile side that could see the most immediate impact.








She is one of two recipients of the Intel Foundation Young Scientists Award this year. The other winner, Henry Lin, created a model that simulates thousands of galaxies. The Gordon E. Moore Award, which honors the best of the best, went to Ionut Budisteanu, who created an AI model that could eventually lead to cheaper self-driving vehicles. The 19-year-old earned $75,000 in scholarship money for his efforts.
Khare’s invention has only been used to light up an LED thus far but it was reportedly able to do a great job at it. With any luck, Khare will continue to develop the technology and bring it up to scale where it could be used inside future portable devices. After all, battery life in portable electronics is still a pretty big problem for most people – especially heavy users.

Google Checkout payment processing tool to shut down on November 20


Google will retire its Checkout payment processing tool on Nov. 20, and warned retailers they will need to move to a different payment processing platform.
Checkout, which launched in 2006, was merged last November with Wallet, Google's mobile payments tool. The product was aimed at taking on eBay's PayPal service, which dominates Web-based payments.
Google said it has partnered with Braintree, Shopify and Freshbooks to offer discounted options for retailers that have not yet selected an alternative payment processor.
Wallet will now be the company's focus. Developers for its Play store will be moved to the Google Wallet Merchant Center, Justin Lawyer, senior product manager for Google Wallet, wrote. There will be no changes for consumers using Wallet on sites such as Priceline and Uber, according to Lawyer.
U.S. merchants who do have a new payment processor can apply for Wallet's "Instant Buy" feature, Lawyer wrote. Instant Buy, which was formerly known as "Google Wallet for Online Commerce," allows consumers to share their payment details with merchants.






Wallet stores users' payment details and is also capable of performing contactless payments with NFC-enabled Android devices.
Last week, Google announced it had integrated Gmail with Wallet, allowing U.S. users to send money to each other. Transfers made from a bank account linked with Google Wallet are free.

12 Benefits Of Having A Website


Some of the goals that can be achieved by launching a website include the following:

1. Far Cheaper and Much More Flexible Than Print Advertising
The Internet is extremely different from print advertising in that space is cheap, your advertisement is accessible for a longer period of time, the content can be changed without having to ask someone to do it for you (if you use a content management system) and you can potentially reach a wider audience.

This is not to say that you should not use other forms of advertising at all. You can use them to entice people to visit your website and find out about your company, potentially opening two-way communication between the potential customer and a sales person.

2. Market Expansion
The Internet has allowed businesses to break through the geographical barriers and become accessible, virtually, from any country in the world by a potential customer that has Internet access.

3. Diversify Revenue Streams
A website is not just a medium for representation of your company, it is a form of media from which everybody can acquire information. You can use this media to sell advertising space to other businesses.

A recent trend has emerged in which businesses feature their very own directory of complementary services, where the visitor can search for information on a business that will enhance the use of your service. The business sells complementary businesses a listing in its directory.
A good example is a catering company featuring a directory with businesses such as event coordinators, electronic equipment rental companies, etc.

4. 24/7/365
No more turning customers away when it's time to close shop, putting up a note saying "closed for public holiday", or leaving an irritating message on your answering service specifying your trading hours. Tell them to visit your website for the information they are looking for.

5. Offer Convenience
It is far more convenient for a person to research a product on the Internet than it is to get in a car, drive somewhere and look for or ask someone for information on a product. Also, a potential customer won't have to judge a call centre agent to determine whether he/she has their best interests in mind, or just wants to make a sale.

The potential customer can visit your website whenever they like, in their own privacy and comfort, without the stresses and distractions that exist in the "real world".
Your website is a self-service medium. For example, instead of having to wait in a long queue to pay your TV License, you can now do it electronically through the TV License website.

6. Add Value and Satisfaction
By offering convenience, a point of reference and that touch of individualized customer service, you ultimately add value to your offering and your customers experience a higher level of satisfaction.

Your website can add value in other ways too: by featuring tips, advice and general interest content, you can "entertain" your customers. This will also help them remember you better.

7. Standardize Sales Performance
By looking at which approaches/pitches have worked in the past and which have not, you can produce the ultimate pitch and use it on your website, so that it is applied to every customer. No more training of salespeople and waiting for them to get a feel for your line of trade.

8. Improve credibility
A website gives you the opportunity to tell potential customers what you are about and why you deserve their trust and confidence. In fact, many people use the Internet for pre-purchase research so that they can determine for themselves whether a particular supplier or brand is worthy of their patronage and won't take them for a ride.
The Internet also allows for viral marketing, where your website visitors spread positive word-of-mouth about your business: your customers do your marketing!

9. Promote Your "Brick 'n' Mortar" Presence
Getting lost trying to find a place can be frustrating for a potential customer. You can publish what they call a "dummy map" on your website, which shows directions and landmarks graphically, and the potential customer can print it out when looking for your "brick 'n' mortar" premises.

You might advertise a promotion on your website encouraging the visitor to visit your "brick 'n' mortar" premises (e.g. "At a branch near you!").
Also, if you recently moved to a new location, you would normally have to wait for the next phone directory to come out before people figure out where you currently are. Because a website is flexible and you can change the content as you like, you can change your contact details instantly and lower the risk of losing customers when moving to a new location.

10. Growth Opportunity
A website serves as a great place to refer potential investors to, to show them what your company is about, what it has achieved and what it can achieve in future.

11. Two-Way Communicative Marketing
Customers can quickly and easily give feedback on your product and/or marketing approach.

12. Cheap Market Research
You can use features on your website such as visitor polls, online surveys and your website statistics to find out what your customers like more and how they feel about certain aspects of your business to determine how you can improve your product and the way you do business.

Monday, May 20, 2013

Do's and Don'ts on LinkedIn

1. Share links (using a URL shortener) to interesting articles, websites or video you have found that some individuals in your network might appreciate. Don't worry about whether all of your connections will find the information equally valuable. Also, try to use words that grab the readers and encourage them to click the link.

2. Pose a question that could lead to solving a problem you have, like: "Anyone know any good controller candidates?" One of my connections saved $20,000 in recruiting fees by posting an update like this a couple days before calling a recruiter. (I apologize to all recruiters for having to mention this situation.)

3. Conduct an informal poll of your network (which consists of many smart businesspeople) relating to a topic that is of interest to you, such as: "What interest rates are you seeing for lines of credit in the current environment?"

4. Mention a person or a situation that might be helpful to some of your connections, like: "I just met with John Jones from ABC Insurance Company and found out they are saving companies lots of $$ on workmen's compensation insurance."

5. Talk about an event you are attending or have attended to encourage involvement and/or questions about what you learned there.

6. If you are a job seeker, don't use this to say, "Hey, I'm still looking for a job." Rather, mention job fairs you are attending, people you are interviewing with, networking events you are going to, etc.

7. Use the "Like" feature when you see a helpful update from one of your connections. Doing this shares that update with your entire network.

DON'T DO THIS:
1. Mentioning personal things--like what you had for breakfast and the fact your dog is sick today--is just wrong. This suggests to the business professionals in your network that you don't really respect their time.

2. Continually talking about specific products and services takes people back to the days of big newspaper ads and screaming radio messages. This is not the purpose of social media, especially LinkedIn.

3. Avoid talking about topics that might be sensitive to some of your audience. I am too embarrassed to even think about, let alone share, some of the items I see posted as status updates. You know what I mean. If your mother wouldn't want you talking about it, don't put it in your LinkedIn Status Box.

4. Think twice before posting your physical whereabouts. I have heard several real-life examples of people's homes being broken into after putting out an "I-am-out-of-town" update on Twitter. Sorry, all you Foursquare users, but I had to share that.

5. The LinkedIn/Twitter interface is causing people to have too many LinkedIn updates as well as inappropriate updates. So, if you are using that interface, be selective about the updates you share between the two platforms. LinkedIn and Twitter are designed with different purposes and strategies.

6. The netiquette on LinkedIn is no more than a couple updates per day, whereas on Twitter you are almost expected to tweet twenty times per day. (I apologize to my Twitter followers for not getting out twenty per day!) So, watch the frequency of your LinkedIn status updates.

7. Don't waste your time reading updates from people who violate all of the above. By using the "Hide" function, you can stop an individual's status updates from showing up on your home page.

7 outsourcing nightmares in the IT business



Outsourcing IT functions can be a smart business move, particularly if your organization lacks specific expertise. IT infrastructure, networking, application development, help desk -- plenty of high-quality service providers are available to fulfill your IT needs.
But like other major business and technology initiatives, outsourcing comes with risks, regardless of how experienced the outsourcing provider is or how good the move looked initially.

Outsourcer employee turnover, communication breakdowns, shortsighted contracts: They can all sink an arrangement, resulting in lost opportunities, downtime, or worse. In the spirit of "forewarned is forearmed," here are seven real-life examples of what can go wrong with an outsourcing initiative, and how to avoid or resolve outsourcing arrangements gone amok.

Outsourcing nightmare No 1: Outsourcing employee exodus

Several years ago, Coalition Technologies had a project for an important client that it sent to an outsourcing partner to complete. The Web design and marketing firm had worked with the outsourcing partner before, and the experience had been positive. The partner had been responsive and provided a high level of quality and communication, says Joel Gross, founder and CEO of Coalition.
"Everything seemed to be moving along fine, until the project neared its completion date," Gross says. Then the outsourcing company's CEO contacted Coalition to report that more than half of the company's staff had quit.
"They did not have the capability to complete the project," Gross says. "As a result, we had to scramble and find a way to resolve [the problem] internally on extremely short notice."
While Coalition was able to deliver the work without too much added delay, it learned a valuable lesson about the risks of outsourcing. Now, the company tries to keep all of its critical IT work in-house, relying on a dedicated, handpicked team.
When technology projects pile up, Coalition does contract outside providers to perform basic tasks, Gross says. It might sound obvious, but including every possible contingency in the contract is vital.
"Avoiding contracting nightmares is possible; you just have to lay the ground rules," he says. "In order to ensure the quality and standard of work, we have a strict and explicit contract that must be signed."
Payment schedules and consequences for late or bug-prone work are central components of those contracts. Contractors receive 25 percent of the cost funded upfront, another 25 percent upon beta completion, and the remaining 50 percent when the project is complete and has been certified bug-free by Coalition project managers.

The U.S. Citizenship and Immigration Services received roughly 50,000 "packages" with H-1B petitions


WASHINGTON - The U.S. Citizenship and Immigration Services received roughly 50,000 "packages" with H-1B petitions on Monday, the first day of filing for the next fiscal year.
Based on historical patterns, each package represents about 1.2 H-1B petitions. A package can contain anywhere from one petition to several hundred.
The data on H-1B petitions comes from FCi Federal, a Leesburg, Va.-based government services and technology provider that is supplying personnel to assist the USCIS in processing the H-1B petitions.
FCi's estimate confirms predictions that H-1B demand will be at its highest level since 2008, the last time the petitions exceeded H-1B visa caps.
If the cap is exceeded this year, which now appears likely, the federal government will distribute H-1B visas via a lottery.
The USCIS will stop accepting H-1B petitions once it reaches its two visa caps -- a general 65,000-visa cap and a 20,000 limit on visas for holders of advanced degrees from U.S. universities. But for the purpose of calculating the total visa count against the caps, the government treats the first five days of April as essentially one day.
FCi's estimate means that the USCIS received somewhere in the range of 60,000 petitions on day one of the filing. The number is not official -- it's clearly an estimate. The USCIS does not disclose how many petitions it has received until after the first five days have passed.
In 2008, the U.S. received some 163,000 petitions in the first five days.
The number of packages received fell sharply on Tuesday, an FCi official said, though the official did not estimate the total.
The large number of petitions received on day one was expected.
The USCIS, in a March 15 press release, said that based on feedback received from "stakeholders," which would include immigration attorneys who prepare petitions, it was possible that the H-1B cap would be met in the first five business days of the filing season.
FCi has 800 employees working at USCIS processing centers, but it has had to hire more than 100 temporary workers to help handle the workload.


Hadoop is not well-suited to the online, interactive data processing required for truly real-time data insights. Or is it?


Apache Hadoop, the open source software framework at the heart of big data, is a batch computing engine. It is not well-suited to the online, interactive data processing required for truly real-time data insights. Or is it? Doug Cutting, creator of Hadoop and founder of the Apache Hadoop Project (and chief architect at Cloudera) says he believes Hadoop has a future beyond batch.
"I think batch has its place," Cutting says. "If you're moving bulk amounts of data and you need to really analyze everything, that's not about interactive. But the combination of batch and online computation is what I think people will really appreciate."


"I really see Hadoop becoming the kernel of the mainstream data processing system that businesses will be using," he adds.

Where Hadoop stands now

Speaking at the O'Reilly Strata Conference + Hadoop World in New York City, Cutting explains his thoughts on the core themes of the Hadoop stack and where it's heading.
"Hadoop is known as a batch computing engine and indeed that's where we started, with MapReduce," Cutting says. "MapReduce is a wonderful tool. It's a simple programming metaphor that has found many applications. There are books on how to implement a variety of algorithms on MapReduce."
MapReduce is a programming model, designed by Google for batch processing massive datasets in parallel using distributed computing. MapReduce takes an input and breaks it down into many smaller sub-problems, which are distributed to nodes to process in parallel. It then reassembles the answers to those sub-problems to form the output.
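As a rough illustration of that model, here is a minimal word-count job written against the Hadoop Java MapReduce API: the mapper emits a (word, 1) pair for every word in its input split, and the reducer sums the counts for each word. The class name and input/output paths are placeholders, and cluster configuration is omitted; treat it as a sketch rather than production code.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Map step: break each input line into words and emit (word, 1).
    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);
                }
            }
        }
    }

    // Reduce step: sum the counts emitted for each word.
    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(SumReducer.class);   // combiner runs map-side to cut shuffle traffic
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));    // e.g. an HDFS input directory
        FileOutputFormat.setOutputPath(job, new Path(args[1]));  // e.g. an HDFS output directory
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Higher-level tools such as Pig and Hive compile their scripts and queries down to jobs of roughly this shape, which is why they could be built on top of MapReduce so readily.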
"It's also very efficient," Cutting says. "It permits you to move your computation to your data, so you're not copying data around as you're processing it. It also forms a shared platform. Building a distributed system is a complicated process, not something you can do overnight. So we don't want to have to re-implement it again and again. MapReduce has proved itself a solid foundation. We've seen the development of many tools on top of it such as Pig and Hive."
"But, of course, this platform is not just for batch computing," he adds. "It's a much more general platform, I believe."

Defining characteristics of the Hadoop platform

To illustrate this, Cutting lays out what he considers the two core themes of Hadoop as it exists today, together with a few other things that he considers matters of "style."
First and foremost, he says, the Hadoop platform is defined by its scalability. It works just fine on small datasets stored in-memory, but is capable of scaling massively to handle huge datasets.
"A big component of scalability that we don't hear a lot talked about is affordability," he says. "We run on commodity hardware because it allows you to scale further. If you can buy 10 times the amount of storage per dollar, then you can store 10 times the amount of data per dollar. So affordability is key, and that's why we use commodity hardware, because it is the most affordable platform."
Just as important, he notes, Hadoop is open source.
"Similarly, open source software is very affordable," he adds. "The core platform that folks develop their applications against is free. You may pay vendors, but you pay vendors for the value they deliver, you don't keep paying them year after year even though you're not getting anything fundamentally new from them. Vendors need to earn your trust and earn your confidence by providing you with value over time."
Beyond that, he says, there are what he considers elements of Hadoop's style.
"There's this notion that you don't need to constrain your data with a strict schema at the time you load it," he says. "Rather, you can afford to save your data in a raw form and then, as you use it, project it to various schemas. We call this schema on read.
Another popular theme in the big data space is that oftentimes simply having more data is a better way to understand your problem than to have a more clever algorithm. It's often better to spend more time gathering data than to fine-tune your algorithm on a smaller data set. Intuitively, this is much like having a higher-resolution image. If you're going to try to analyse it, you'd rather zoom in on the high-resolution image than the low-resolution image."
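To make the schema-on-read idea concrete, here is a small, self-contained Java sketch that is not tied to any particular Hadoop API: raw log lines are stored untouched, and two different "schemas" are projected from the same lines only at read time. The record layout and field names are invented purely for illustration.

import java.util.Arrays;
import java.util.List;

// Schema on read: store raw records untouched, project structure only when querying.
public class SchemaOnReadDemo {

    // Raw data as it might sit in HDFS: no schema enforced at load time.
    private static final List<String> RAW_LINES = Arrays.asList(
            "2013-05-20T10:15:00 alice /index.html 200",
            "2013-05-20T10:15:03 bob /cart.html 404",
            "2013-05-20T10:15:09 alice /checkout.html 200");

    public static void main(String[] args) {
        // Projection 1: who requested which page.
        for (String line : RAW_LINES) {
            String[] f = line.split(" ");
            System.out.println("user=" + f[1] + " path=" + f[2]);
        }
        // Projection 2: status code per timestamp, parsed from the very same raw lines.
        for (String line : RAW_LINES) {
            String[] f = line.split(" ");
            System.out.println("time=" + f[0] + " status=" + Integer.parseInt(f[3]));
        }
    }
}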

HBase is an example of online computing in Hadoop

Batch processing, he notes, is not a defining characteristic of Hadoop. As proof he points to Apache HBase, the highly successful open source, nonrelational distributed database modeled on Google's BigTable, which is part of the Hadoop stack. HBase is an online computing system, not a batch computing system.
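A minimal sketch of that kind of online access, using the HBase Java client API of the era (HTable, Put, Get), might look like the following. The table name, column family and ZooKeeper quorum are assumptions, and later HBase releases replace HTable with the Connection/Table API.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseOnlineAccess {
    public static void main(String[] args) throws Exception {
        // Picks up hbase-site.xml from the classpath; the quorum below is a placeholder.
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum", "zk-host.example.com");

        HTable table = new HTable(conf, "users");  // assumed table with column family "info"
        try {
            // Online write: a single cell, immediately visible to readers.
            Put put = new Put(Bytes.toBytes("row-42"));
            put.add(Bytes.toBytes("info"), Bytes.toBytes("email"), Bytes.toBytes("a@example.com"));
            table.put(put);

            // Online read: fetch that one value back without any batch job.
            Get get = new Get(Bytes.toBytes("row-42"));
            Result result = table.get(get);
            byte[] value = result.getValue(Bytes.toBytes("info"), Bytes.toBytes("email"));
            System.out.println("email = " + Bytes.toString(value));
        } finally {
            table.close();
        }
    }
}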
"It performs interactive puts and gets of individual values," Cutting explains. "But it also supports batch. It shares storage with HDFS and with every other component of the stack. And I think that's really what's led to its popularity. It's integrated into the rest of the system. It's not a separate system on the side that you need to move data in and out of. It can share other aspects of the stack: It can share availability, security, disaster recovery. There's a lot of room to permit folks to only have one copy of their data and only one installation of this technology stack."

Looking ahead to the Hadoop holy grail

But if Hadoop is not defined by batch, if it is going to be a more general data processing platform, what will it look like and how will it get there?
"I think there are a number of things we'd like to see in the sort of 'Holy Grail' big data system," Cutting says. "Of course we want it to be open source, running on commodity hardware. We also want to see linear scaling: If you need to store ten times the data, you'd like to just buy ten times the hardware and have that work automatically, no matter how big your dataset gets."
The same holds for performance, Cutting says: whether you need greater batch throughput or lower batch latency, you should simply be able to add hardware, and the same goes for interactive queries. Increased hardware should give linear scalability in both performance and the amount of data processed.
"There are other things we'd like to see," he adds. "We'd like to see complex transactions, joins, a lot of technologies which this platform has lacked. I think, classically, folks have believed that they weren't ever going to be present in this platform, that when you adopted a big data platform, you were giving up certain things. I don't think that's the case. I think there's very little that we're going to need to give up in the long term."

Google provided a map

The reason, Cutting says, is that Google has shown the way to establish these elements in the Hadoop stack.
"Google has given us a map," he says. "We know where we're going. They started out publishing their GFS and MapReduce papers, which we quickly cloned in the Hadoop Project. Through the years, Google has produced a succession of publications that have in many ways inspired the open source stack. The Sawzall system was a precursor to Pig and Hive; BigTable directly inspired HBase, and so on. And I was very excited to see this year Google publish a paper called Spanner, about a system that implements transactions in a distributed system: multitable transactions running on a database at a global scale. This is something that I think a lot of us didn't think we'd see anytime soon, and it really helps us to see that the sky's the limit for this platform."
Spanner, Cutting notes, is complicated technology and no one should expect to see it as part of Hadoop next spring. But it provides a route to the Holy Grail, he says. In the meantime, he points to Impala, a new database engine released by Cloudera at the conference this week, which can query datasets stored in HBase using SQL.
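As a flavor of what such interactive queries look like in practice, here is a hedged sketch of querying Impala from Java over JDBC. Impala exposes a HiveServer2-compatible interface, so the Hive JDBC driver is commonly used; the driver class, port, host name and the table being queried are all assumptions that depend on the particular cluster setup.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ImpalaQueryExample {
    public static void main(String[] args) throws Exception {
        // Assumed: the Hive JDBC driver is on the classpath and Impala is listening on its
        // default HiveServer2-compatible port (21050) on this host.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        String url = "jdbc:hive2://impala-host.example.com:21050/default;auth=noSasl";

        Connection conn = DriverManager.getConnection(url);
        try {
            Statement stmt = conn.createStatement();
            // Interactive, low-latency SQL over data already sitting in HDFS or HBase;
            // "page_views" is a hypothetical table used here only for illustration.
            ResultSet rs = stmt.executeQuery(
                    "SELECT url, COUNT(*) AS hits FROM page_views GROUP BY url ORDER BY hits DESC LIMIT 10");
            while (rs.next()) {
                System.out.println(rs.getString("url") + "\t" + rs.getLong("hits"));
            }
        } finally {
            conn.close();
        }
    }
}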
"Impala is a huge step down this path toward the Holy Grail," he says. "Now, no longer can you [only] do online puts and gets of values, you can do online queries interactively with Impala. And Impala follows some work from Google, again, that was published a few years ago, and it's very exciting. It's a fundamental new capability in this platform that I think is a tremendously valuable step on its own and will help you build more and better applications on this platform. But also I think it helps to make this point, that this platform isn't a niche. It isn't a one-point technology. It's a general purpose platform."
"We know where we're going with it," Cutting says, "and moreover we know how to get there in many cases. So I encourage you to be comfortable adopting it now and know that you can expect more in it tomorrow. We're going to keep this thing advancing."

SAP is adopting more open source software


Although not traditionally known for its contributions to the open source community, Germany-based SAP is adopting more open source software, as well as contributing more of its own code back to the community, company officials said in an interview.
"In the past we didn't have an open source strategy," said Claus von Riegen, SAP's program director of technology standards and open source. "That has changed over the last two years or so."
In 2005, Shai Agassi, then the SAP executive in charge of the company's product group, expressed ambivalence about using open source software. In the years since, however, the company has warmed to the idea. SAP's chief rival Oracle, for instance, is an active, if controversial, supporter and sponsor of many open-source software projects.
In 2007, SAP began contributing significantly to the Eclipse project, and in October 2009, the company joined the Apache Software Foundation. In 2009, SAP contributed 1.8 million lines to the Eclipse project, making it the third-largest corporate contributor.
While SAP should not be considered an "open-source company" in the same way as say, Red Hat, the company nonetheless "represents a good case study on how proprietary companies have learned that it is in their best interests to contribute to open source software projects," wrote 451 Group enterprise software analyst Matthew Aslett in a review note.
For SAP, using open source has become "a matter of development productivity," von Riegen said. "We have a lot of areas where we develop our own software, but there are a lot of commodity areas where we don't need to differentiate ourselves -- that's where we want to more efficiently use existing software, like open source," he said.
In these cases, it makes sense to use the open-source application, saving the time and cost to develop the identical functionality in-house. Now the company uses more than 100 open-source applications developed outside of SAP.
In order to use all of this externally generated code, SAP has standardized the way it manages its use of open-source software. Using a program called Code Center, offered by Black Duck Software as part of its Black Duck Suite, von Riegen's office runs a company-wide registry of which open-source applications have already been approved by SAP for use within its products. It also specifies which versions of these applications have been approved, which streamlines the maintenance process for the company.
This centralized approach helps the company deal with licensing issues, said Janaka Bohr, SAP's head of global licensing for open source. Before any software is approved, the company's lawyers must check the license to ensure it does not conflict with the company's plans for the product. The centralized approach cuts down on the number of times a lawyer has to check a license and reduces the amount of due diligence work a development team must do.
"In the past our developers had to spend a few hours researching an open-source product to find the licenses, to find the technical information," Bohr said.
The Black Duck software also includes a library for scanning code to unveil what open-source code is embedded within other applications. SAP doesn't want to inherit, say, a GPL violation, which could force the company to open source the entire program that uses a snippet of GPL code.
The ability to review code has also been crucial in helping SAP in its process of acquiring other companies. Even if SAP didn't use open-source software, it would still have to grapple with all the open-source software used by the companies it acquires. Overall, in 15 acquisitions since 2007 (not including Sybase), the company has had to examine 2,000 different software programs.
On Friday, SAP announced that it has finalised its US$5.8 billion purchase of Sybase. Although Sybase will continue to operate as a separate company, SAP has still inherited a lot of code in the purchase.
While von Riegen would not comment on the Sybase acquisition specifically, he did say, in general, SAP invests a lot of effort in understanding what code it is acquiring as part of any potential sale.
Although SAP engineers typically are not allowed to review the code of a company that it intends to purchase, the Black Duck software can be used by a third party to scan the software and return a list of what open-source code has been found.
This activity has been tremendously helpful, von Riegen said. It allows SAP to get a handle on the code base of the company it intends to acquire. In one case, a company that it had acquired had claimed to be using no open-source code, when, in fact, it had embedded more than 80 open-source applications within its own programs.
"Some of the acquisition targets claim that they don't use open source, but when you scan you find quite a lot of open-source code," he said. In at least one case, a planned acquisition fell through because the review of the code base revealed far more open source was being used than the takeover prospect had claimed.

Does open source technology really work for business?


Free, community-supported versions are fine for testing or non-critical needs, but when the work is mission-critical, users say they are more likely to pay for enterprise versions of open-source applications.

Jeremy Cole, a co-founder of MySQL consulting vendor Proven Scaling LLC, says that sometimes this split development model can cause unintended problems. One issue, he said, is that businesses that need to rely on stable, mature code aren't always getting what they pay for.
At MySQL, Cole says, "they release the enterprise version more often than the community version". What that means is that "while enterprise users are getting fixes faster, they're essentially running untested code", he said.
Others share the concerns. Such issues are growing in importance as more large companies buy open-source companies, giving a boost to open-source software in enterprise systems. Sun Microsystems' recent acquisition of open-source database vendor MySQL AB is the most recent evidence of this trend.
The beauties of open source
Bill Parducci, CTO of Think Passenger, which builds online communities for companies and their customers, says open-source code is important to his young company because it lowers technology costs and allows customization of key source code.
"The concept of an organisation pushing out the code faster so their clients can get the code faster, I don't agree with that," Parducci says. "Customers can't keep up." Because of such pressures, Linux vendor Red Hat doubled the length of its new version cycles several years ago to better meet the needs of its customers, he said.
"Software is more stable and supportable when [new versions are] less frequent. There's no value in software that doesn't work predictably."
Parducci says he is seeing more examples of software that takes a "hybrid approach" between open source, closed source, functionality, risk and support. "At the end of the day, you need to solve a problem," he said. "I think we're finally over the day of people running up the hill with a flag of open source or a flag of anti-open source."
Think Passenger uses a host of open-source applications, including Red Hat Enterprise Linux, CentOS Linux, Iona Technologies' Fuse Message Broker, Jetty Web server and Terracotta's network-attached memory applications.
Parducci said he uses the paid enterprise versions of most applications so he can get expert support and the most stable code. With Iona, "they take it, they stabilise the releases, they package it together and put support around it," he said. "It's the same basic code as the community version with support and stabilisation. It's working out well for us."
Parducci said he looks at whether a prospective open-source vendor is trying to upsell to a proprietary version of its product or whether a proprietary version is needed to maintain full functionality with other products. "To me, that really becomes a red flag," he said.
"Are they supporting the open-source stuff just to sell me up to the other side?" Working with most open-source vendors has been satisfactory, he said, but there is room for improvement, particularly among the smaller vendors. Such vendors need to ensure "timely feedback and improved communities" so that business users can get the help they require, he said.

Enterprise versions worth the cost
Justin King, a systems administrator for the Human Neuroimaging Laboratory at Baylor College of Medicine in Texas, said he's found that community versions of open-source applications are adequate for his needs, but that buying enterprise versions often saves time because they are more developed and include useful administrative features.
 

 King said he uses open-source applications from Red Hat, web infrastructure management vendor Hyperic and others. "In the enterprise versions, in most cases, the main thing is stability," he says. "You can live without having certain [new and improved] features. The absolutely most critical thing is uptime and stability."
"The best model to look at is Red Hat," King says. "They've got [the community supported] Fedora [version of Linux] and it changes frequently. Then there's Red Hat Enterprise Linux that's stable and supported [for enterprise users]. That's the correct model of enterprise open source as far as I'm concerned."
For mission-critical business users, "nobody in their right mind is going to rely on something" that doesn't have adequate support and stable releases, King continues. "They'll go with supported versions if it exists to run their business. At the end of the day, if something's broken and nobody on-site can figure it out... it's cheaper to call the support guy and choke him until he figures it out."
Gautam Guliani, the executive director of software architecture at New York-based Kaplan Test Prep and Admissions, a college entrance exam testing company, says he prefers to buy enterprise versions of all open-source applications used in mission-critical roles.
Using community-based applications in pilot projects and non-critical business functions is acceptable, he said, but if his company wants to use it, it will pay for the enterprise version to get the support.
More road-map direction
Kaplan uses a small assortment of open-source applications, including JBoss middleware, Red Hat Linux and Alfresco web content management software. Getting adequate and timely support hasn't been a problem in general, Guliani says, but getting future road map direction from open-source vendors can be tougher than with proprietary vendors.
"The development road map is not as thought out as much sometimes as we'd like with open-source companies," he says. "Some do it well, but for most there is room for improvement."
What open-source vendors offer to his business, he says, is lower costs for support, deepening maturity, code flexibility, "a much deeper level of transparency into the software products," and a higher rate of innovation.
"The releases tend to come more frequently" with open-source vendors, he says. "If they come too often, it can be a problem. At least if they're coming often, we can choose not to upgrade to a new release. Most open-source vendors have realised that if they bring out a new version, that they shouldn't drop support for the old one too fast."
What's happened, say analysts, is that open-source software has quietly become an integral part of corporate IT, whether through community-based or enterprise versions.
Raven Zachary, a 451 Group analyst, says companies don't even look at software as being open source or proprietary, but analyse it based on what will work best for them.
"I don't run into enterprises very often that would be willing to give up functionality," he says. "Enterprises are going to purchase technology that will allow them to do their jobs. Sometimes that means proprietary. Sometimes that means open source. Generally, large enterprises are going to make decisions about what is right for them regardless of whether it's open source or proprietary, based on value."
Donald DePalma, an analyst at Common Sense Advisory, says business users with large data centers are typically using enterprise versions of open-source applications because of their mission-critical requirements. "Individual rogue business units are using community-supported versions," he says.
"There are levels of open-source use," DePalma said. "MySQL is so widespread in use that it seems almost Oracle-like in its commercial viability, so users don't even see a distinction. I think we'll see more of this moving forward."

West Texas A&M University wanted to develop a single sign-on portal


When West Texas A&M University wanted to develop a single sign-on portal for its 8,000 students that would unify its Web applications, student resources and social networking services, a steering committee came up with a list of six criteria for evaluating available software. They would compare software systems' features, mobility, single sign-on capabilities, look and feel, and flexibility, as well as their ability to integrate with existing Web applications.
But this wasn't an apples-to-apples comparison. CIO James Webb threw in a pair of open-source projects to be considered alongside commercial software packages. While it was easy to compare the systems on many of the criteria (the open-source pair won in all six categories), the committee had to add another question: How strong is the open-source user community, and could it help the university achieve its goals? The answer was yes, and the Canyon, Texas-based school chose the two open-source tools: uPortal, an architecture based on Java and XML, which also included support for mobile devices, and Jasig's Central Authentication Service (CAS) for its single sign-on service.
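For readers unfamiliar with how CAS-based single sign-on works, the sketch below shows the back half of the CAS 2.0 protocol: after the browser has been redirected to the CAS /login page and returned with a service ticket, the application validates that ticket server-to-server against /serviceValidate. The host names and service URL are placeholders, and a production deployment would normally use one of the existing CAS client libraries rather than raw HTTP calls.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class CasTicketValidator {

    // Validate a service ticket (the "ticket" query parameter CAS appends on redirect).
    public static String validate(String casBaseUrl, String serviceUrl, String ticket) throws Exception {
        String validateUrl = casBaseUrl + "/serviceValidate"
                + "?service=" + URLEncoder.encode(serviceUrl, "UTF-8")
                + "&ticket=" + URLEncoder.encode(ticket, "UTF-8");

        HttpURLConnection conn = (HttpURLConnection) new URL(validateUrl).openConnection();
        BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream(), "UTF-8"));
        StringBuilder response = new StringBuilder();
        String line;
        while ((line = in.readLine()) != null) {
            response.append(line).append('\n');
        }
        in.close();

        // CAS answers with XML such as <cas:authenticationSuccess><cas:user>jdoe</cas:user>...
        // A real client would parse the XML properly; this sketch extracts the user crudely.
        String xml = response.toString();
        if (xml.contains("<cas:authenticationSuccess>")) {
            int start = xml.indexOf("<cas:user>") + "<cas:user>".length();
            int end = xml.indexOf("</cas:user>");
            return xml.substring(start, end);  // the authenticated username
        }
        return null;  // ticket was invalid or expired
    }

    public static void main(String[] args) throws Exception {
        // Placeholder endpoints for illustration only.
        String user = validate("https://cas.example.edu/cas",
                               "https://portal.example.edu/uPortal/Login",
                               "ST-1234-exampleTicket");
        System.out.println(user != null ? "Logged in as " + user : "Ticket rejected");
    }
}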

"One of the main reasons we went with the uPortal open-source solution is that Yale, Rutgers and the University of Wisconsin-Madison are the major developers. So I guess you could say it was built by higher ed for higher ed," says Webb. "We know we have an ecosystem of great universities that are contributing to the open-source initiative, supporting it and providing additional features to keep this product innovative."
Open source is the new X factor in software selection. More than 50% of all software purchased will be open source by 2017, according to a 2012 survey of 740 enterprises released by a collaboration of 26 open-source companies. That finding signals a tipping point for open-source software adoption in the enterprise and nontechnical fields such as the automotive, healthcare and financial services industries.
Choosing the right open-source offering could be critical to an organization's success. But evaluating an open-source project holds more caveats and pitfalls than picking traditional software. IT departments must consider the culture of the open-source community, the quality and timeliness of releases, the project's governance model and the availability of support. They also have to consider whether, and to what degree, they're willing to contribute code and fixes back to the community.
Here, organizations that have successfully adopted open-source systems share the criteria they used to evaluate projects and their philosophy about giving back to the open-source community.
'Projects' vs. 'Products'
Many IT departments evaluate open-source systems the same way they assess commercial products. They look for tools that offer superior functionality and lower maintenance and support costs. Many also turn to open source to escape vendor lock-in, foster sustainability within the IT infrastructure and spur innovation in IT operations.
But there are other things to consider when looking at open-source systems, such as the culture of the community, the consistency of the product's quality, and how quickly the community responds when security fixes and patches are needed.
"It's important to evaluate smaller, open-source projects differently than larger, corporate-sponsored open-source products," says Tomas Nystrom, a senior director and global lead for open source at Accenture.
There are hundreds of thousands of small open-source projects or libraries, such as NAS and Spring, that rely heavily on user communities. Then there are open-source products, such as Red Hat Linux, which are managed by, and often owned by, companies that are in the business of selling software.
Sprint Nextel decided that a well-established product would best meet its needs when it ventured cautiously into open source, having grown tired of paying vendors millions of dollars in maintenance fees for Web and application server software, even as the need for support declined.
"We had built an internal team who was responsible for the Web and apps servers, and we believed we could move to an open-source product and still be successful," recalls Alan Krause, director of enterprise application integration at Sprint. But going it alone was a scary proposition for the CIO and a vice president, who both wanted the security of having a vendor to lean on if problems arose.
"There really was some trepidation there," Krause recalls. So the organization chose JBoss Enterprise Application Platform as its new middleware and Red Hat Enterprise Linux as its new operating system. It also used Red Hat's consulting team to help with implementation and let a Red Hat relationship manager serve as liaison with the open-source community.
"We're kind of dipping our toe into open source," Krause says. "We're still paying some maintenance for it, but it's significantly cheaper than what we were paying before."
When looking at open-source products like Red Hat, the selection criteria are no different from those that apply to commercial software, Nystrom says. "They're considered to be normal vendors with high-quality products that are comparatively cheap."
As open-source products gain traction at companies like Sprint Nextel, IT departments will feel more comfortable turning to smaller, open-source projects to foster innovation, Nystrom says. "If you're building something custom, it's typical that you use [open source] somewhere during development," he says. "It's almost impossible not to use it if you want to build a very modern application."
In such cases, Nystrom recommends a bottom-up approach for choosing open-source projects.
"Developers and architects know what the communities are like and which are the libraries that are in much use today," Nystrom says. "They have a clearer view of which library we should use for which purpose, or which version of some type of persistent API we should be using here, or what's the best log-in library. So you can narrow down the number of libraries that are relevant for the enterprise very quickly -- from hundreds of thousands to probably less than 100, depending on what you want to build." And from there it's a quick move to a few "usual suspects," he adds.
West Texas A&M chose the CAS project for its single sign-on system because CAS had been successfully deployed at Texas A&M University in College Station "and the references were solid," Webb says. His team also attended user events and higher-education conferences related to CAS as part of the decision-making process.


Open Source Gives Back
Several nonprofit open-source organizations now help companies give back to the community by providing their programmers with opportunities to volunteer their time and talents to benefit social causes.
Through the work of nonprofit organizations such as Benetech, FrontlineSMS, The Guardian Project, Mozilla Webmaker and Wikimedia Foundation, so-called humanitarian free and open-source software has emerged as an important tool in tackling global social challenges, including civic engagement, disaster relief, education, healthcare and human rights.
Several tech companies already connect their technologists with opportunities to contribute their skills to projects that benefit social causes -- as VMware does through its #ContributingCode initiative, for example. But any company can get involved in such initiatives. One source of information about these efforts is SocialCoding4Good, which is running a pilot program with several nonprofit organizations that develop humanitarian free and open-source software.
What can companies and employees gain by giving back? Plenty, according to one of several nonprofit groups that organize open-source projects to improve the lives of people worldwide.
"It creates a tremendous professional development opportunity for employees," says Gerardo Capiel, vice president of engineering at Benetech, which sponsors open-source projects benefiting literacy and education, environmental conservation and human rights. Some programs leverage their company's existing technologies and can influence how they affect the world. Others let programmers choose their own cause from a list of nonprofits.
Contributing to social change can have an impact on employees, as well. Programmer Abhi Mahule was looking to donate his skills and time to a cause when he learned about Benetech, which wanted to build an Android-based e-book reader for the visually impaired. Mahule took an existing open-source e-book reader and adapted a version for Android that could "read" books aloud as audio. He built a prototype, and Benetech secured funding from the U.S. Department of Education to bring it to market. Today, thousands of people use the app, Capiel says.
The project "helped me [hone] my technical skills," says Mahule, but adds that the intangible benefits were more significant. "It was a source of joy and a nice feeling that in a small way you're able to contribute," he says. "You should always look out for a larger cause for the greater good. This is the perfect opportunity for that."
- Stacy Collett
It Takes a Village
For many open-source projects, the developer community is the lifeblood of the software, and those who are new to open source should know that these communities all operate differently.
The well-established Linux community, for example, has operated under founder Linus Torvalds' "benevolent dictatorship" since its inception. But developers of new projects often keep tight control of their communities as well.
WibiData, a Hadoop-based user analytics company that helps organizations build big data applications, provides part of its software stack as open source to make it easier for developers to build big data applications on an HBase NoSQL database.
"Right now, 99.5% of the software is written by our own team," says Aaron Kimball, chief architect at WibiData. "It takes a relatively long time to get people to use it, and for every 50 people who use it, one might start helping to contribute."
Then there are the radically democratic models. Developers who donate a product to the Apache Software Foundation, for instance, must reach a "lazy consensus" with the community, which means "you need some number of individuals to give your idea a thumbs-up and for nobody to give it an explicit thumbs-down -- and if they do, they are obligated to work with you to make the changes," Kimball says. "It's designed to slow things down in some ways so all users can be invested in this and through consensus arrive at the best solution." Still, the developers who participate most actively in writing source code are expected to be the ones who are listened to first, he adds.
Is It Better to Give Than to Receive?
IT departments might think that when they buy into open source they also have to actively participate in the community to ensure its survival. But that's not always the case.
With widely used open-source products like Red Hat, "[vendors are] very much in control of the community," Nystrom says. And while they do take from the community, "they still control the product," he adds. "They're not dependent on the community for the product to be stable and go forward."
Sprint Nextel currently relies on Red Hat consultants as its liaison with the open-source community, but Krause believes the company will need less hand-holding as time goes by. "We will eventually move away from Red Hat being our support system and work directly with the open-source community," he says.
For users of smaller open-source libraries or projects, communities are much more important.
"There's just a group of people who put this together, and there might not be a commercial entity behind it," Nystrom says. In these cases, developers are expected to contribute, but what if they refuse?
One open-source user says it's hard to contribute, or "pay it back," when the product is industry-specific.
When Hallmark Services Corp. (HSC) in Naperville, Ill., was overhauling its back-end systems, it bought a license for the open-source code of Healthation, a commercial off-the-shelf system for administering healthcare business transactions.
Taking an open-source approach reduced the amount of labor required to complete the project, enabling HSC to finish more than nine months early and save $4.8 million in labor costs, according to Neal Kaderabek, CIO and vice president of financial services. HSC is a co-developer of the software with Lisle, Ill.-based Healthation, and it has the right to exclusive use of functionality that it developed -- it doesn't have to make it available as open source.
"We rarely check anything back in -- we just take it out, modify it and make it unique to our business," Kaderabek says, adding that HSC shares less than half of what it develops with the community. "Frankly, we think that sets us apart from our competitors, so why would we want to let the world share it?"
He acknowledges that Healthation was disappointed that HSC wasn't contributing to its open-source community. "Their view was that's what makes their product more attractive to the industry. But in this case, I just felt like it was our secret sauce," he says.
That's not often the case, industry-watchers say. Most open-source applications are essentially commodities, and the platform itself doesn't usually hold many trade secrets.
HSC processes $3.5 billion worth of insurance premiums annually and provides services to about 1.5 million retail insurance members.
The company chose Healthation because it was the only healthcare transaction software Kaderabek knew of that was available as open source. With Healthation, HSC could kick-start its IT transformation project because the majority of new core functions were already in place and the IT team had to customize only about one-third of the system.
"This [open source] out of the gate was leaps and bounds ahead of the design and architecture" of traditional software systems, Kaderabek says. "It was built on latest and greatest technology; it used Web services; it was .Net using SQL server -- which all met our standards. We got more done in a shorter period of time and didn't have to add extra resources," he says.
Kaderabek says that even when evaluating small or industry-specific open-source projects, IT shops should look for vendors that specialize in maintaining an open-source offering. "Make sure there's somebody out there who can say, 'I've done this for the last five years, and I know people who have done what you're doing,' in case you need help," he says.
When It's OK to Give It Away
Contributions to an open-source community don't have to be huge to be valuable. "If there's a low-level feature that's a more convenient way to do something -- that saves everybody time," says WibiData's Kimball. "Sometimes even small changes that may not take more than an afternoon to write will have an outsized benefit on usability."
WibiData initially developed its entire software stack alone, but in September 2012 it decided to make part of that stack available as open source and released the Kiji project in November.
Offering some tools as open source benefits WibiData in several ways, most notably by broadening the company's user base, says Kimball. Fundamental layers of the stack have a low value, and users won't pay for tools that aren't unique to their business, especially if similar tools are available. Open-sourcing those layers introduces new users to other WibiData offerings. "There are plenty of people who can make use of these components who [weren't] customers or potential customers, but now they're using and testing the same software that our paying customers use," Kimball says. "So everybody enjoys increased reliability of the overall system by virtue of it being more widely adopted."
Moreover, open source provides a foot in the door to companies that might not be ready for a big-data tool yet. "If common-based layers of our overall system are widely available through open source, [developers] might just start using it. And later on, when their organization needs to get serious about using an open-source application, it's much easier for us to go in and sell to those business users because our software already runs on parts of their stack. Interoperating with it and getting it to work with the rest of our systems is much easier rather than if they had built this same system in a completely bespoke fashion."
Kiji has received only a few contributions from its developer community so far, but Kimball believes that will change. "For every 15 people who use it, one might file a bug report -- without providing a fix. But it's very early days," he says. "Where this goes is an open question."
The future of open source in general looks bright. Broader adoption will create larger communities for testing and feedback, which in turn will drive innovation in areas such as cloud computing, mobile and big data, according to open-source vendors.
The innovation cycle is also creating new business models. "Open source is key to a company's ability to innovate and sustain innovation with financial benefits, interoperability and a supportive community," Webb says. "Those are the things that are going to keep it going."

US Defense Department approves Apple's iOS devices for its network


Devices built around Apple's iOS operating system have been approved by the U.S. Department of Defense for use on its networks, as the department moves to support multivendor mobile devices and operating systems.
The Defense Information Systems Agency (DISA), which certifies commercial technology for defense use, said Friday it had approved the Apple iOS 6 Security Technical Implementation Guide (STIG).
"Approval of the STIG means that government-issued iOS 6 mobile devices are approved for use when connecting to DOD networks within current mobility pilots or the future mobile device management framework," the agency said in a statement.
The department earlier this month cleared BlackBerry 10 smartphones and PlayBook tablets with its enterprise mobility management platform BlackBerry Enterprise Service 10 to be used on its networks. It also approved Samsung Electronics' Knox, a new Android-based platform designed by the company to enhance security of the current open source Android.
The DOD mobility strategy includes mobile devices configured to the STIG, in combination with an actively managed and defended mobile device management (MDM) system, DISA said. The agency is responsible for establishing MDM, which provides a process for managing and distributing mobile applications and an enhanced cyberdefense infrastructure. DISA is running a pilot program to bring all the pieces together.
A DOD spokesman, Lt. Col. Damien Pickart, said earlier that a decision on Apple's iOS was expected to be taken last week. Several mobile devices and operating systems are currently going through DISA's STIG review and approval process, Pickart said via email earlier this month.
"We look forward to additional vendors also participating in this process, further enabling a diversity of mobile devices for use within the department," Pickart said. The approvals do not result in product orders, he added. Actual orders will be linked to operational requirements and availability of funding with user organizations, DISA said in its statement.
DOD currently has more than 600,000 commercial mobile devices in operational and pilot use, including about 470,000 BlackBerry devices, 41,000 running Apple operating systems and 8,700 on Android. A Commercial Mobile Device Implementation Plan aims to permit use of the latest commercial technologies such as smart phones and tablets, and to develop an enterprise mobile device management capability and application store to support approximately 100,000 multivendor devices by February 2014, DISA said.

Dell reports 79% fall in profits


The PC maker's net profit fell 79% to $130m (£85m) in the three months to 3 May, on revenue down 2% to $14bn.
Dell is in the middle of a dispute between founder Michael Dell and two of its biggest shareholders.
Mr Dell wants to take the company private, but some investors oppose the plan.
Mr Dell and private equity group Silver Lake have offered to take the company private for $24.4bn, and have pledged to shift the business away from PCs to mobile devices.
But two of its biggest shareholders - the investor Carl Icahn and Southeastern Asset Management - have argued that the offer undervalues the company, and that Mr Dell's deal is a "giveaway".
Instead, they have proposed to offer additional shares to shareholders and install new management.
In its quarterly results, Dell said that revenue from new technologies, services and software rose 12% to $5.5bn. That was in contrast to PC sales, which fell 9%.
The company did not issue profit guidance for the second quarter because of the ongoing dispute. It has created a special committee of the board to study the private equity deal and alternative bids.




Imagine What Tumblr Will Look Like When Yahoo Buys It

Yahoo + Tumblr =

Ew.

Limited Upside For PC Assemblers

Gartner India says that the system builder market declined from 3.06 million units in 2011 to 2.33 million units in 2012. While Gartner does not track the individual components used to build a PC, Vishal Tripathi, Principal Analyst, Gartner India, estimates that the PC building-block market has seen a similar decline in growth rates.

The projections for the assembled PC market for 2013 are around 2 million units. With an estimated upgrade market of around 15 percent, the component market is estimated to be around the size of 2.3 million assembled PCs. Experts put the total size of the component market at Rs 4,600-5,500 crore.
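
The arithmetic behind these figures is straightforward, and a small sketch makes it explicit. Note that the per-PC component value below is a hypothetical assumption, chosen only so the total lands inside the Rs 4,600-5,500 crore range quoted by experts; it is not a figure from Gartner or this article.

# Rough sizing of the component market, following the article's arithmetic.
# The per-PC component value is a hypothetical assumption for illustration only.

assembled_pcs_2013 = 2_000_000        # projected assembled PCs in 2013
upgrade_share = 0.15                  # upgrade market, roughly 15% of the assembled base

pc_equivalents = assembled_pcs_2013 * (1 + upgrade_share)
print(f"Component market size: ~{pc_equivalents / 1e6:.1f} million PC-equivalents")

# Hypothetical average component spend per PC-equivalent (Rs), chosen so the
# total falls in the Rs 4,600-5,500 crore range quoted by experts.
avg_component_value_rs = 22_000
total_rs_crore = pc_equivalents * avg_component_value_rs / 1e7   # 1 crore = 10^7
print(f"Implied market value: ~Rs {total_rs_crore:,.0f} crore")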

“In terms of absolute numbers, the market still offers ample opportunities for thousands of resellers,” says Tripathi. “However, the overall size of the market has shrunk by 40-60 percent over the past 3-4 years.”

Microprocessors
The major hope for a revival of the desktop market is Intel's fourth-generation Core architecture, code-named Haswell, which is expected to launch in June this year.


“We are committed to ensuring that the system builder market stays alive. We are also rallying behind OEMs to make technology available faster to our partners”
B Suryanarayanan, Director, Sales & Marketing, Intel, South Asia
Intel is driving its white-box system builder ecosystem to focus on two markets: the ultra-compact form factor and the PC-like embedded market. For the former, the chip major has announced a new form factor called Next Unit of Computing (NUC) and has started shipping the product to partners. “We are committed to ensuring that the system builder market stays alive and thriving. We are also investing in ensuring that new exciting form factors are developed, and are rallying behind OEM suppliers to make technology available faster to our partners,” says B Suryanarayanan, Director, Sales & Marketing, Intel, South Asia. “Apart from small form factors, we are urging channels to look into opportunities in areas such as digital signage, surveillance, PoS and gaming where growth is forecast for white-box PCs.”

Disk drives
The disk drive shortage following the floods in Thailand has been blamed for half the issues in the component market. “I believe that the component markets would not have taken such a beating if drive prices had not gone up. OEMs managed to get supplies at the right prices while assemblers had to pay heavy prices for getting stocks,” points out Sampath Kumar MP, CEO, Positive Systems, Kochi.

IHS iSuppli has predicted a decline of 11.8 percent in HDD revenue to $32.7 billion worldwide in 2013 compared to 2012. It further states that the HDD revenue will remain flat in 2014 at $32 billion.

“Component markets would not have taken a beating if drive prices had not gone up. While OEMs got supplies at right prices, assemblers had to pay heavily”
Sampath Kumar MP, CEO, Positive Systems
The best hope for the disk drive market is the expected demand for SSD drives as SSD prices drop further. “Many customers would prefer to upgrade their drives to SSD which would ensure that their PCs or notebooks run much faster, hence we are urging our channels to sell SSDs as an upgrade option,” says MA Mannan, Country Manager, Corsair Memory, India & Saarc.

IHS iSuppli predicts that the average selling price of hard drives will decrease by 7 percent in 2013. At the same time it expects SSD prices to fall by over 60 percent through the year.
Both Seagate and Western Digital have been shipping hybrid drives, which pair a small-capacity SSD, used for caching or for storing critical data such as the operating system, with traditional spinning platters for larger capacities. While these drives are typically 40-100 percent more expensive than traditional drives, their performance is almost comparable to that of low-end SSDs.
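
To see roughly what the quoted percentages would mean at the shelf, here is a minimal sketch; the 2012 baseline prices are hypothetical assumptions for illustration, not figures from IHS iSuppli or the vendors.

# Illustrative 2013 price projections from the percentage changes quoted above.
# Baseline 2012 prices are hypothetical assumptions, not reported figures.

hdd_price_2012 = 55.0        # hypothetical average HDD selling price (USD)
ssd_price_per_gb_2012 = 1.0  # hypothetical SSD price per GB (USD)

hdd_price_2013 = hdd_price_2012 * (1 - 0.07)                 # HDD ASPs fall ~7%
ssd_price_per_gb_2013 = ssd_price_per_gb_2012 * (1 - 0.60)   # SSD prices fall >60%

# Hybrid drives are quoted at 40-100 percent more than a traditional drive.
hybrid_low, hybrid_high = hdd_price_2013 * 1.4, hdd_price_2013 * 2.0

print(f"HDD ASP 2013:  ~${hdd_price_2013:.0f}")
print(f"SSD $/GB 2013: ~${ssd_price_per_gb_2013:.2f}")
print(f"Hybrid drive:  ~${hybrid_low:.0f}-${hybrid_high:.0f}")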

Motherboards
“The explosion of digital content is driving the need for storage and flash memory irrespective of whether a person is using a mobile, tablet or notebook”
Manisha Sood, Country Manager, Sandisk India
The biggest news last year was Intel’s announcement of its plans to quit the desktop motherboard market by 2016; this has given hope to many motherboard makers. “Intel dropping out of the motherboard business will help us to grab a bigger share of the market. Going forward we aim to increase our motherboard market share from the present 23 percent to 30 percent by 2013-end,” informs Vinay Shetty, Country Manager, Component Business, Asus India.

In H2 2013, the big bet is the transition to the new LGA1150 socket as Haswell processors hit the market. Many partners also expect Intel to cut its production of previous-generation processors to ramp up volumes and accelerate the transition to the new processors by year-end; this would get the motherboard upgrade market moving.

Some motherboard vendors are optimistic. Says Rajan Sharma, VP, Sales, Motherboard, Digilite, “We are expecting above-average growth in FY2013-14 if the corporate and SMB segments turn around. Increasing focus on small form factor PCs and penetration into Class B and C cities will drive the segment.”

Graphics cards
According to AMD India, the independent graphics card market is hovering between 1,50,000 and 2,00,000 units, and growing. Says Vikas Gupta, MD, Technology & Gadgets, a Mumbai-based authorized distributor of AMD and Nvidia graphics cards, “We have witnessed a 50 percent increase in our sales compared to early 2012. We used to sell 5,000 units per month, but for the last five months we have been selling 7,500 units. The demand is higher because of the enriched content from social media sites, YouTube and movie watching, as well as professional use by design and animation studios.”

Adds Rajesh Gupta, Country Manager, India, Zotac, “The demand for professional graphics cards is quite high because India is slowly emerging as a destination for creative outsourcing services.”

Memory
According to IDC, after an increase of 21.4 percent in 2012, the average growth of DRAM content per PC is expected to decline to a record low of 17.4 percent in 2013.

Opportunities in the memory market will be limited to opportunistic players because prices are expected to keep fluctuating. “We do not expect any major transition to happen in the next few quarters as it will be DDR3 only. Opportunities for upgrades remain good, with costs being low especially for notebooks,” says Piyush Pandey, Marketing Manager, Strontium India.

Removable storage
The removable storage market is expected to be healthy. “The explosion of digital content is driving the need for more storage and flash memory irrespective of whether a person is using a mobile, tablet or notebook. We see new technologies such as 3-bit-per-cell (X3) also improving the overall reliability of the solution,” says Manisha Sood, Country Manager, SanDisk India.

“As the market matures there will be opportunities in gaming PCs, workstations and other embedded devices. System builders need to find their niche in these areas”
Sushmita Das, Country Manager, Kobian India
With USB 3.0 slowly becoming a standard option in newer motherboards, PCs and notebooks, Sood expects many users to scrap older USB 2.0 drives and buy new USB 3.0 drives.

ODDs
With the shift toward smaller form factors and the proliferation of broadband Internet, the optical drive market is declining rapidly. Average prices of DVD writers fell to less than Rs 1,000 last year. While further price drops are not expected, channels no longer see demand for optical drives as a standard option with PCs. With faster USB 3.0 media now preferred even for software installation, the days of the optical drive are numbered.

Outlook
While the assembled desktop market, which fuels the building-blocks segment, has declined rapidly over the past two years, most vendors are optimistic that the decline will be arrested and that the next two years will see a flat market, which is the best outcome they can hope for.

“Growth will be flat, but the market size will remain stable for the next two years, following which the size may reduce with the changes in technology. However, as the market matures, there will be significant opportunities for assemblers in gaming PCs, workstations and other embedded devices. System builders will need to find their niche to leverage these opportunities,” says Sushmita Das, Country Manager, Kobian India.     

Can Yahoo Buy Its Way Back To Relevance?


With the $1.1 billion acquisition of Tumblr apparently wrapped up and ready to be announced Monday morning, Yahoo and its hard-charging CEO Marissa Mayer are making a big bet that the Internet pioneer can return to relevance.
Since she took over Yahoo 300 days ago, Mayer has shaken up the staid company with everything from controversial work policies to a wholesale turnover of the executive team to a flurry of acquisitions. But the purchase of Tumblr shifts her attempted renovation of Yahoo onto an entirely new plane.

It’s easy to find good reasons for the deal, since Yahoo’s continued struggles to keep up with rivals such as Google and Facebook make almost any bold move look good. But two loom largest:
* Relevance
As many have pointed out, Tumblr’s appeal to the current young generation of people online is what Yahoo itself has to rekindle. Or, as this deal indicates, what Yahoo has to buy wholesale and hope it can avoid screwing up. Tumblr clearly has struck a chord with people who want an easy, flexible, and creative way to express themselves to friends and the rest of the world. The startup has managed to achieve that delicate balance even amid so many other choices, from WordPress to Facebook to Twitter.
What’s more, while Tumblr initially struggled like many an established online company with people’s rapid shift to accessing content and communicating on mobile devices, it has recently seen its mobile traffic skyrocket. Yahoo, the iconic Web company, needs as much mobile mojo as it can beg, borrow, buy or steal, so Tumblr should help it in a big way.
Not least, Mayer’s latest move, like many of her others, has people noticing Yahoo again, and not because of some new train wreck. As former Yahoo executive and current LinkedIn CEO Jeff Weiner tweeted: “Whether new policy decisions, product launches, or acquiring Tumblr, Marissa’s got people talking about Yahoo again in a meaningful way.”
There’s no telling if talk will turn to use and engagement, but it’s a necessary start. And it’s one that no previous CEO, at least in a positive way, has been able to do in, oh, about 15 years.
* Revenues
Seriously. Everyone knows Tumblr, whose CEO David Karp famously (and, in Silicon Valley, far from uniquely) once said he didn’t like advertising, isn’t making much money. That’s leading many people to say (again, not uniquely) that Yahoo is way overpaying.
They’re missing the point. Tumblr has bet on so-called “native” advertising that, like Google’s search ads, Facebook’s news feed sponsored stories, and yes, even television commercials, fits into the flow of what people are already doing on those respective media.
What Yahoo brings to that native advertising, author and Federated Media Publishing Executive Chairman John Battelle correctly recognizes, is scale. The knock on native advertising is that advertisers don’t want to create a different ad for each site, because they can’t reach people at large scale that way.
Yahoo, which for all its faults still has a very large audience, could bring scale to Tumblr’s native advertising, especially if former Google executive Mayer can recharge and apply Yahoo’s automated ad technology to it. Native ads at scale could finally woo brand advertising dollars from television, especially as people’s video consumption continues to fragment.
If even some of these scenarios come to pass–and not everything has to work perfectly–$1.1 billion will look like chump change. For pete’s sake, Yahoo paid almost $6 billion for Broadcast.com, which became–well, perhaps Mark Cuban can remind us what. Yahoo spent $3.6 billion on GeoCities, which became Yahoo GeoCities, never to be heard from again. Overture Holdings, inventor of the business model (if not the implementation) that made Google what it is today, cost it $1.6 billion.
Indeed, those failures (OK, maybe not Overture, which at least gave Yahoo a decent search ad business for a while) are why a lot of folks have their doubts about Tumblr, or at least the price to buy it. (That, and the lack of revenues and all the porn.)
Clearly such doubts are on Mayer’s mind. Reports indicate Yahoo also may have some news Monday on Flickr, the photo sharing site it bought when it was hot years ago, only to let it languish like it did with countless other acquisitions. I have to believe that whatever Mayer announces, she will attempt to send a message that sounds something like this: We’re fixing Flickr, so you can be sure we’re not going to fumble Tumblr.
But let’s get real. Regardless of all the pontificating you’ll hear in coming days and weeks, no one knows the answer to that question in the headline. Mayer has only barely reversed Yahoo’s decline. So all the detailed explanations, like those in this post, of how Tumblr helps Yahoo like YouTube helped Google or like Instagram might help Facebook? They’re all moot. So are all the predictions that Tumblr is GeoCities 2.0 and Mayer is chasing an asset that will start depreciating as soon as she buys it.
That’s why, despite everything you’ll read, Yahoo’s future won’t be decided Monday when the company announces it will buy Tumblr. It will be determined by what Mayer and her team do with Tumblr, and many other acquisitions and homegrown initiatives, over the next two or three years.


Study: 78% Of Salespeople Using Social Media Outsell Their Peers

When Jim Keenan, the social sales specialist, describes his work today, he’ll tell you that he’s “ushering salespeople from the old world into the social world” – from the cold calling world to the Twitter world, from the salespeople who call prospects incessantly to the salespeople who educate their prospects with relevant content. Keenan’s argument in The Rise of Social Salespeople is that using social media to sell increases profits.

But up until now, we’ve had no real data. So, sensing an opportunity, Keenan’s firm recently released a report on the impact of social media on quota attainment, and the results were impressive.
The most interesting finding was that in 2012, 78.6% of salespeople using social media to sell outperformed those who weren’t using social media. He tells me he wasn’t expecting a number that high. Then Keenan found that when it came to exceeding sales quota (exceeding quota by more than 10%), social media users were 23% more successful than their non-social media peers. Keenan told me that no matter how you sliced the data, social media users came out on top. (Note: You can download The Impact of Social Media on Sales Quota and Corporate Revenue here.)
I realize that many will argue that the numbers may reflect correlation more than causation, and they have a point. But consider that over half of the respondents (54%) who used social media tracked their social media usage back to at least one closed deal. Over 40% said they’ve closed between two and five deals as a result of social media, and more than 10% of the respondents said, “Yes, it directly contributes to my closes.” Respondents were very clear: social media was a leading factor in their closed deals.
As you dig deeper, the data continues to support the premise that social media helps salespeople make quota. Let’s take a look at Keenan’s numbers:
Social media users have also exceeded quota (by 10% or more) at a higher rate than non-social media users every year since 2010. That means more social media users make President’s Club than non-social media users.
Not only do social media users achieve and exceed quota (win) more often than non-social media users, they also don’t miss quota (lose) as often. In 2012, non-social media users missed quota (by more than 10%) 15% more often than social media users.
With numbers like these and the success salespeople are having with social media, the obvious question is how much time salespeople are spending on social sites, and the answer from Keenan’s report is quite surprising: 50.1% of salespeople who report using social media say they spend less than 10% of their selling time on it. That’s decent ROI.
Keenan says the top social selling sites were, in order, LinkedIn, Twitter, Facebook, blogs, Google+ and others. He also tells me that almost 75% of the salespeople surveyed said they have not received any formal training from their company on how to use social media. I’m guessing that’s primarily due to sales management and its lack of understanding of social sales.
Keenan reminds me that social selling is not a panacea. But as he’s shown, those who have been using it are quickly gaining a competitive advantage. If Keenan’s data is statistically valid, then it’s clear. Social media can positively affect quota – which impacts revenue – which leads to better growth opportunities for business.
That also means that LinkedIn, Twitter, Facebook, Foursquare, Google+, a blog and the like are no longer nice-to-haves; they are salesperson must-haves.


http://www.forbes.com/sites/markfidelman/2013/05/19/study-78-of-salespeople-using-social-media-outsell-their-peers/

Google, Zoho and Microsoft - Cloud App

Office Apps on the Cloud 

Many partners who took the CRN Cloud Adoption Survey feel that the easiest public cloud computing service to offer through channels is migrating on-premise office applications to the cloud. Google, Zoho and Microsoft have more than 2,000 partners in the country selling their office applications on the cloud. Commissions range from 12-19 percent annually on new subscriptions and 6-10 percent for existing customers. Ever since Microsoft announced the Office 365 Open program, which allows billing through a distributor along with flexible pricing and payment terms for customers, things have become easier for partners. Services opportunities: Migration per user from on-premise is expected to fetch a one-time fee of Rs 500-1,000.
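
As a rough back-of-the-envelope on the partner economics described above, the sketch below combines the quoted commission bands and migration fee for a hypothetical deal; the seat count and per-seat subscription price are assumptions, not figures from the survey.

# Back-of-the-envelope partner revenue for a cloud office-suite migration,
# using the commission bands and migration fee quoted in the article.
# Seat count and per-seat subscription price are hypothetical assumptions.

seats = 100
annual_subscription_per_seat_rs = 4_000   # hypothetical annual price per user (Rs)

new_commission = (0.12, 0.19)             # 12-19% on new subscriptions
renewal_commission = (0.06, 0.10)         # 6-10% on existing customers
migration_fee_per_user_rs = (500, 1_000)  # one-time migration fee per user (Rs)

subscription_value = seats * annual_subscription_per_seat_rs

year1_low = subscription_value * new_commission[0] + seats * migration_fee_per_user_rs[0]
year1_high = subscription_value * new_commission[1] + seats * migration_fee_per_user_rs[1]
renewal_low = subscription_value * renewal_commission[0]
renewal_high = subscription_value * renewal_commission[1]

print(f"Year 1 partner revenue:  Rs {year1_low:,.0f} - {year1_high:,.0f}")
print(f"Renewal-year commission: Rs {renewal_low:,.0f} - {renewal_high:,.0f}")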

UC on the cloud

A growing opportunity is emerging with UC delivered from the cloud, and apart from the big players, many niche players have emerged. While Microsoft offers Lync as an added feature with Office 365, competition is coming from Cisco, Digium and Avaya, which have announced hosted UC offerings. Other players such as Super Receptionist, Knowlarity, Twilio and Plivo allow SIs to offer telephony solutions over the Internet. Services opportunities: There are huge opportunities for SIs to replace traditional PBXs with cloud telephony solutions. Cisco estimates services revenue to be almost equal to the first year’s annual hosting costs.

 Storage/Backup/DR

While primary storage is still not considered suitable for movement to a public cloud because of latency issues, content that is served over the net is fast moving to the public cloud. Amazon’s public storage service S3 is being used by Web application owners to host static content, while Akamai and Amazon CloudFront have emerged as options for caching and delivering that content. Meanwhile, secondary storage, backup and archival are emerging as the hottest opportunities, with almost two dozen vendors in the fray. Many channel partners have also set up their own white-labeled public cloud practices. For instance, Sanovi Technologies has started offering solutions on public clouds, while Dropbox recently announced a corporate plan. For archival, Amazon has announced Glacier, which offers 1 TB of archival storage for as low as about $10 a month. Services opportunities: Apart from migration, channels can play a services role in consulting, integration and maintenance.
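
The archival price mentioned above is easy to sanity-check. The sketch below assumes a storage price of about $0.01 per GB per month, roughly Glacier's launch-era pricing, and ignores retrieval, request and transfer charges, so it is only an approximation.

# Rough monthly cost of archiving data at ~$0.01 per GB-month (assumed pricing);
# retrieval, request and transfer charges are ignored for simplicity.

price_per_gb_month = 0.01   # assumed archival price, USD per GB per month

def monthly_archival_cost(tb):
    """Return the approximate monthly storage cost (USD) for `tb` terabytes."""
    return tb * 1024 * price_per_gb_month

for tb in (1, 10, 100):
    print(f"{tb:>4} TB -> ~${monthly_archival_cost(tb):,.2f} per month")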

 Business Applications

Salesforce.com was the pioneer in moving CRM to the cloud. Today, Microsoft offers Microsoft Dynamics online, while SAP has announced multiple partnerships to offer various SAP applications on the cloud. One of the biggest challenges so far has been the extensive customization that business applications require when hosted on a public cloud. Vendors such as TCS have found ways to sell hosted ERP in a consultative model. While Tally has stayed away from hosted models of its ERP solution, a number of Tally partners have set up cloud-enabled models around Tally. Services opportunities: Services opportunities remain more or less the same as with on-premise models: consulting, integration, training and customization.



IT Management

Almost all vendors in the IMS tool space have announced cloud initiatives. While Kaseya has long had a beta version, Zoho and Sapphire have hosted models. However, partners expect Microsoft’s Windows Intune to provide the biggest surprise because 93 percent of Indian SMBs run mostly Windows-based infrastructure. Services opportunities: Automated managed services are emerging as the future of IMS, so partners are expected to offer cloud-based IMS as a service-assurance offering in its own right rather than as a value addition.

Infrastructure & Platform

Globally, most public cloud services revenue comes from PaaS and IaaS. In 2013, Microsoft, Amazon and Google are each expected to see revenue in excess of a billion dollars from these services. However, in India few channel partners have started offering services around these platforms, one reason being that migrating on-premise applications to them requires skill sets that not all partners possess. With Microsoft and Amazon investing in training and support, analysts expect more partners to venture into offering solutions on top of Amazon EC2, Microsoft Azure, Red Hat OpenShift or Google App Engine. Services opportunities: Amazon estimates that resellers can make up to $2 in services revenue for every dollar spent on hosting charges in the first year, apart from recurring commissions.