Posts tagged Been
Matt Cutts Explains How You Can Tell If Your Website Has Been Hit By A Particular Algorithm by @mattsouthern
Matt Cutts, Google’s head of search spam, answered a question about Google penalties in his latest Webmaster Help video where a user writes in to ask: How can you tell if your site is suffering from an algorithmic penalty, or you are simply being outgunned by better content? Although the user didn’t specifically ask about […]
View full post on Search Engine Journal
Negative SEO: Have Mercenaries Been Hired To Torpedo Your Search Rankings?
Schemes like this, unfortunately, are an aspect of an emerging trend we're seeing when it comes to negative SEO. Campaigns like these are used to point thousands of poor-quality, spammy links at a competitor's website in an attempt to cause their …
View full post on SEO – Google News
A post went up on Google’s official Webmaster Central Blog last night from a representative of the Search Quality Team providing tips for how to find out if your site has been hacked, as well as fix it and prevent future incidents. Since hacking is surprisingly common I felt it was important to pass along […]
View full post on Search Engine Journal
Technology journalist Robert X. Cringely thinks IBM is doomed because it just sold its Intel server business to Lenovo. On the contrary, this may be the clearest indication yet that IBM will thrive. After all, given the trend toward cloud and build-your-own datacenters, has there ever been a worse time to be selling enterprise servers?
“An Act Of Desperation For IBM”
Let’s be clear. Every incumbent hardware company is under the gun as low-margin cloud businesses boom. Amazon puts every hardware company under pressure and is even causing fits for those trying to make a business of selling private cloud technology.
Yet Robert X. Cringely, a longtime IBM critic, believes IBM “has sold the future to invest in the past,” referring to its mainframe business, which it retains. He goes on to suggest that “little servers are the future of big computing” and that “IBM needs to be a major supplier and a major player in this emerging market.”
Yes and no.
It seems clear that selling big hardware like mainframes is a dying business. Yes, enterprises will continue to buy it, but if the last few earnings calls from IBM, Oracle and their peers are any indication, big hardware is a difficult proposition in the age of cloud.
Not that the big incumbents are giving up on big hardware. As reported by ReadWrite in November 2013, Oracle CEO Larry Ellison believes the future datacenter will include purpose-built, big hardware and low-end commodity servers, with the latter constituting the core of enterprise workloads. But that core will not be powered by Oracle. Or IBM. Or any mega-vendor.
The problem is that these legacy server companies are not buying into that “purpose-built,” insanely expensive hardware, either. Hence, while CA Technologies may like to pretend that the mainframe is an integral part of the “data center of the future,” as a recent Wall Street Journal advertisement proposes, IT buyers aren’t buying.
Why Not Sell “Little” Servers?
If big hardware is struggling, why shouldn’t IBM, Oracle and other enterprise incumbents trade in commodity servers? In large part, they can’t. Not while being profitable anyway.
The commodity server business has been further commoditized by the rise of white box server vendors and open-source datacenter initiatives like Facebook’s Open Compute project. As Accenture writes, “Facebook’s Open Compute Project is accelerating the adoption of infrastructure innovations by sharing those breakthroughs freely.” For incumbent server vendors, “freely” is the last thing they want to hear.
It may be on the verge of getting even worse. According to McKinsey & Company, in 2014 enterprises need to increase their emphasis on private cloud deployments:
Many large infrastructure functions are experiencing “cloud stall.” They have built an intriguing set of technology capabilities but are using it to host only a small fraction of their workloads. It may be that they cannot make the business case work due to migration costs, or that they have doubts about the new environment’s ability to support critical workloads, or that they cannot reconcile the cloud environment with existing sourcing arrangements. Over the next year, infrastructure organizations must shift from treating the private cloud as a technology innovation to treating it as an opportunity to evolve their operating model.
If this happens, and there are good reasons to believe enterprise developers will continue to skip the private cloud in favor of public cloud options like Amazon Web Services, it won’t serve enterprise hardware companies very well. With increasing interest in open datacenter designs, enterprises can utilize private clouds with low-end, white box vendor servers rather than higher-cost, name-brand servers from the likes of IBM.
Which, presumably, is one big reason IBM sold its commodity server business.
The Future Of Hardware Is Software
Venture capitalist Marc Andreessen argues that “software is eating the world.” Along the way, it’s also eating hardware. At least, the fancy name-brand hardware that used to mint billions for IBM and its peers.
This is what Cringely misses. He blithely suggests of IBM that “they are selling a lower-margin business where customers are actually buying to invest in a higher-margin business where customers aren’t buying.” This is true. But it doesn’t lead to his conclusion: “IBM needs to learn how to operate in a commodity market. IBM needs to become the lowest cost, highest volume producer of commodity servers.”
This is like suggesting that IBM needs to slit its right wrist instead of its left wrist. In either market, IBM is going to lose. The difference is that it can milk the high-margin, fading business for years as it tries to transform itself into a commodity cloud computing business. With the acquisition of SoftLayer, it is well on its way, though the journey will be brutally painful.
Which, I suppose, is how I’d describe any company trying to make a living peddling hardware. Or cloud, for that matter. The cloud is compressing margins on all hardware businesses, even as Amazon forces would-be cloud competitors into a game of low-margin commodity cloud pricing. For hardware companies, it seems to be a lose-lose proposition. But it may be the only option they have.
View full post on ReadWrite
Don’t say we didn’t warn you. Bad guys have already hijacked up to 100,000 devices in the Internet of Things and used them to launch malware attacks, Internet security firm Proofpoint said on Thursday.
It’s apparently the first recorded large-scale Internet of Things hack. Proofpoint found that the compromised gadgets—which included everything from routers and smart televisions to at least one smart refrigerator—sent more than 750,000 malicious emails to targets between December 26, 2013 and January 6, 2014.
The hack came to light over the relatively quiet holiday period when a security researcher at Proofpoint noticed a spike in thousands of malicious messages sent from a range of IP addresses she didn’t recognize, David Knight, a Proofpoint executive in charge of information security products, told me in an interview.
Curious, she began pinging the devices and soon realized that they weren’t PCs, the usual platform for launching this sort of attack. Instead, many were otherwise unidentified devices running a standard version of Linux. Pinging one device brought up a login screen that read: Welcome To Your Fridge. She typed in a default password—something like “admin” or “adminadmin,” Knight says—and suddenly had access to the heart of someone’s kitchen.
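Why is a factory-default password so dangerous? Because an attacker doesn’t need to crack anything; a short, well-known list of guesses is often enough. The sketch below illustrates the idea. The password list and the simulated “device” are invented for illustration, not drawn from Proofpoint’s report:

```python
# Toy illustration of why factory-default credentials are so dangerous:
# an attacker only needs to walk a short, well-known list of guesses.
# The passwords and the simulated device here are hypothetical examples.
COMMON_DEFAULTS = ["admin", "adminadmin", "password", "1234", "root"]

def find_default_login(try_password):
    """Return the first default password accepted by `try_password`
    (a callable standing in for a real login attempt), or None."""
    for candidate in COMMON_DEFAULTS:
        if try_password(candidate):
            return candidate
    return None

# Simulated appliance that shipped with (and still uses) "adminadmin".
device_accepts = lambda pw: pw == "adminadmin"

print(find_default_login(device_accepts))  # → adminadmin
```

A handful of guesses is all it takes, which is why changing the default password is the single most effective step a device owner can take.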
As the age of Smart Everything dawns, it’s also bringing online a host of largely unsecured smart devices like TVs, refrigerators and even toasters. Those devices are often trivial for knowledgeable hackers to compromise, opening new opportunities for malicious actions of various kinds—of which the malware attack Proofpoint identified may be among the mildest.
“Embedded operating systems deployed in firmware tend to be old, not patched very frequently, and there are known vulnerabilities to virtually all of them,” Knight said. Proofpoint’s investigation highlights how vulnerable connected devices are and how easy it is for hackers to take advantage of them.
Hacking The Home
Craig Heffner, a security researcher who teaches a class on exploiting connected devices, told ReadWrite in December that his students are usually surprised by the lack of security in connected home devices.
“If you look at the vulnerabilities being published, they’re not sophisticated,” he said. “Usually, the vendor put a back door in the product and someone took advantage.”
Worse, connected home devices, which often run on outdated software, may be difficult or even impossible to patch. Security expert Bruce Schneier detailed the wild insecurities of the Internet of Things in a recent column for Wired:
[I]t’s often impossible to patch the software or upgrade the components to the latest version. Often, the complete source code isn’t available. Yes, they’ll have the source code to Linux and any other open-source components. But many of the device drivers and other components are just “binary blobs” — no source code at all. That’s the most pernicious part of the problem: No one can possibly patch code that’s just binary.
Malware isn’t the only thing people have to worry about. Knight said hackers could use compromised smart devices to launch distributed denial of service (DDoS) attacks aimed at knocking target Websites offline, mine bitcoins, or store stolen or otherwise illicit data.
Knight suggests the first step in protecting your gadgets is to change the default passwords. Beyond that, if you don’t need your device connected to the Internet, then don’t connect it.
“Don’t plug it in if you don’t plan to use it,” he said. “If you do put it on the Internet, try and make sure you put it behind your personal router and firewall in your environment.”
View full post on ReadWrite
Has SEO Been Reincarnated By Content Marketing?
Business 2 Community
Back then, many SEO experts would have told you to make the link building process as quick as possible. That's why profile links on forums and social media sites were so popular. So were high-PR blog posts that accepted links in their comments. And, if …
View full post on SEO – Google News
Guest author Melissa Olsen is a passionate fan of classical music and is currently the marketing director of Morbie.
Early this year, Swedish development company DoReMIR Research finally released its widely praised ScoreCleaner app, a program that transcribes music into notation with impressive accuracy. It’s been promoted as the answer for would-be composers who need help converting their ideas into something more shareable in the musical community. It’s a great example of machines helping humans produce art in the form of music.
It’s not alone, of course. There are many more examples of machines helping the musical process along. It was a reverse talk-back circuit that gave us the gated reverb of the ’80s (think of the drums from “In the Air Tonight”), and auto-tuning gave us the distinctive sound of Daft Punk’s greatest hits.
On a more advanced level, we also have vocal synthesizers now: the Japanese Vocaloids, programmable “singers” in a box who can be made to sing the compositions of whoever happens to own the program. Machines, it seems, have been helping composers do their work for quite a while.
In fact, they might be able to replace the composers already.
The Artificial Creator
Enter such visionaries as David Cope, who is responsible for the controversial Emmy and the later but still controversial Emily Howell. Emmy was the first of Cope’s fascinating creations, a program of Experiments in Musical Intelligence (EMI) that eventually became so advanced that it could recreate famous composers’ musical styles in new compositions.
How did it work? Essentially, Cope fed Bach’s compositions to the program to establish a database from which the composer’s patterns could be deduced. Then he added another database for deriving Bach’s rules of composition. These covered not just the standard ways Bach strung notes together but also the moments when he went against his own “formulae” (or deviated from his own standard). Finally, he added classifications of phenomena found in music.
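Cope’s actual EMI system was far more sophisticated, but the core idea of deducing a composer’s patterns from a corpus and recombining them can be sketched with a toy Markov chain over note names. Everything below (the “corpus,” the note names) is invented for illustration and has nothing to do with real Bach data:

```python
import random
from collections import defaultdict

# Toy sketch of pattern deduction: learn which note tends to follow
# which in a corpus, then recombine those transitions into a "new"
# melody. The corpus is an invented stand-in, not an actual Bach phrase.
corpus = ["C", "E", "G", "E", "C", "E", "G", "C", "D", "E", "C"]

transitions = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    transitions[a].append(b)  # record every observed successor of note a

def compose(start, length, rng):
    """Generate a melody by randomly walking the learned transition table."""
    melody = [start]
    for _ in range(length - 1):
        successors = transitions.get(melody[-1])
        if not successors:  # dead end: this note was never followed by anything
            break
        melody.append(rng.choice(successors))
    return melody

print(compose("C", 8, random.Random(0)))
```

Every consecutive pair in the generated melody is a transition actually observed in the corpus, which is why the output sounds “in the style of” its source; Cope’s databases captured far richer structure than single-note successors, but the recombination principle is similar.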
The result? By 1987, Emmy’s Bach-style compositions were being played to a speechless audience at the University of Illinois. And polarizing listeners around the world.
A Composer Without a Soul
Cope’s work with Emmy and the program he regards as her daughter, Emily Howell (a more interactive composition program that actually engages Cope in an exchange that helps it put together music gradually through associations and Cope’s preferences), has elicited powerful responses in many people. Some welcome it, but perhaps even more view it with distrust.
Critics have complained that Emmy’s work had no “soul” … even though several of them had said the opposite about the same pieces before they were told who, or what, Emmy was.
The fear seems to be that this technology is an incursion into what many consider the “final frontier” of human qualities: creativity. Indeed, given the frightening success of Emmy’s work, some might even say this incursion is an outright invasion. But a core part of the problem has to do with the argument that creativity or emotional meaning can only stem from “a soul.”
The idea of a soul isn’t something we can really properly explain. It’s not even something we really understand, which is probably why there’s a tendency to imagine it as something divorced from machines (which we can explain and understand). But it’s an important point in this question of music coming from the machine. After all, if one really looks at the relation between the artist, art, and the viewer, isn’t the “soul” of art more a function of the observer’s perception than a possession of the artist him/her/itself?
“Real people” have produced art that’s been called “soulless.” Cope’s program has produced music that some have found desperately moving. Why should the artificiality of an artwork’s creator render the art itself any less genuine for the observer affected by it?
The Future Of Artificial Creativity And Music
Using mathematics and formulae to create art isn’t new. Many classical composers, Mozart included, were already doing it long before. Much of the resistance to Emmy seems to be in its removal of the human from the process of creation, its ability to do all the work itself.
Still, Emmy had its limitations. Cope himself stopped using it and “trashed it” by destroying its extensive database, doing so partly out of concern that its very success at what it did, its amazing productivity, was becoming its own curse. For even the most beautiful of artworks can be dulled somewhat by relentless repetition of its theme.
Cope’s more interactive program, the second-generation Emily Howell, is less reliant on reproduction by repetition. It also doesn’t make the human composer fully unnecessary to the process, so it might be (slightly) less alarming to those who received Emmy with such negativity.
But even with Emmy, humans were never fully unnecessary. It took humans to process and draw personal meaning from its works, as noted above. And if the day comes that people come up with programs that can do that as well, that can respond to stimuli like music with that sort of full-fledged perception, who can still argue about soullessness in the machine then?
Photo courtesy of Flickr member North Cascades
View full post on ReadWrite