Saturday, July 28, 2012

Google, not wanting to be outdone by Bing, has updated Google Maps and Google Earth with new high-resolution imagery that encompasses 25 cities and 72 countries. Google had previously announced it was taking aerial shots of the globe to include in Maps and Earth, and this imagery is finally starting to see the light of day.

[Image: Google updates Maps and Earth with new high-resolution imagery]

Quite frankly, the imagery is gorgeous. Since the Olympics are in town, Google is of course using Olympic Park and Village in London as the poster child, as you can see in the picture above. Hands down, this level of detail makes Google Maps the number one mapping service. If you look at competitors, it isn't even close.

Google has also increased its 45° offerings with 21 U.S. cities and 7 international locations. Google explains which locations got what:

Cities with new high resolution 45° imagery: United States: Anderson, CA; Beech Island - New Ellenton, GA; Cape Girardeau, MO; Carthage, MO; Chicago, IL; Clarksville (outskirts), TN; Columbus - Reynoldsburg, OH; Dayton, OH; Everett, WA; Galena, KS; Idaho Falls, ID; Joplin, MO; Lafayette (outskirts), LA; Lancaster, CA; Louisville, KY; Lowell, MA - Nashua, NH; Pittsburgh, PA; Pueblo (outskirts), CO; Redding, CA; Springfield, IL; Yuba City, CA.
International: Birmingham, UK; Catania, Italy; Denia, Spain; London, United Kingdom; Meyrin - Vernier, Switzerland; Munich, Germany; Neuchatel, Switzerland.

Areas with new high resolution aerial imagery:
United States: Antelope Wells, NM; Bryce Canyon, UT; Green Bay, WI; Huron, SD; Hutchinson, KS; Olympia, WA; Park Hills, MO; Peach Springs, AZ; Phoenix, AZ; Placerville, CA; Riverside, CA; Rosenfeld, TX; Waverly, OH.
International: Wiener Neustadt, Austria; Alicante, Spain; Denia, Spain; Gandia, Spain; Las Rozas, Spain; Lugo, Spain; Santander, Spain; Sueca, Spain; Vitoria, Spain; Bern, Switzerland; Geneva, Switzerland; Nyon, Switzerland.

Countries/regions with new high resolution satellite updates:
Argentina, Australia, Austria, Belarus, Botswana, Brazil, Bulgaria, Burkina Faso, Canada, Cape Verde, Chile, China, Colombia, Croatia, Cuba, Cyprus, Djibouti, Dominican Republic, Egypt, Eritrea, Estonia, Ethiopia, France, Greece, Greenland, Guinea-Bissau, Guyana, Haiti, Honduras, Hungary, India, Indonesia, Iran, Italy, Jamaica, Latvia, Lesotho, Libya, Lithuania, Madagascar, Mauritania, Mexico, Morocco, Namibia, Nepal, New Zealand, Nicaragua, Pakistan, Papua New Guinea, Paraguay, Peru, Poland, Romania, Russia, Saudi Arabia, Serbia, Slovakia, Slovenia, South Africa, Spain, Switzerland, Tunisia, Turkey, Ukraine, United Kingdom, United States, Uruguay, Venezuela, Western Sahara, Yemen, Zambia, Zimbabwe

Start looking now as it will take you days to check out all of the new imagery.

Source: http://www.tweaktown.com
SEO Expert Oracle Digital confirms release of the latest Google algorithm update and explains how enhanced ethical SEO practices will play a primary role in dealing with the latest changes rolled out by the giant Internet company.

After weeks of speculation, Google has released a new algorithm update following its last update, Panda 3.8, which was implemented exactly one month ago. This latest algorithm change, Google Panda 3.9, is just one of the series of updates that the company has carried out over the past months.

The latest Panda version was confirmed by the company on its official Twitter account, where it announced, “New data refresh of Panda starts rolling out tonight. ~1% of search results change enough to notice.”
According to SEO experts, the effects of this update may not be as large as those of the first ever Panda update in February 2011, which affected 11.3% of search queries. On the other hand, others point out that 1% of the approximately 1 billion searches made daily still works out to roughly 10 million queries a day, which is a significant number.

With this latest development, many businesses and SEO practitioners are now monitoring their organic search volumes and rankings to gauge the effects of the update. Its full impact will most probably be felt within the next few days or weeks.

On another note, speculation has surrounded this slate of algorithm adjustments made by Google. According to some analysts, they were implemented to fix the major information glut the company is suffering from, while others speculate that the move helps Google maintain its web dominance and remain competitive against the rise of social media. Most importantly, this latest change could improve user experience by delivering more relevant search results.

Experts and analysts agree that Google’s actions are meant to eliminate web spam, which means that all SEO efforts must use only proper procedures and methods in order to avoid the negative impact of both the Panda and Penguin algorithms.

To deal effectively with these updates, SEO experts suggest better content management: add unique and relevant content to your site, avoid duplicate articles, create interesting articles that draw readers’ attention and build a following, and apply other white hat SEO techniques.

James Corby, Oracle Digital’s Business Development Director said, “The purpose of these Google updates has always been about quality. If you want to achieve good rankings and increased traffic, you must continue using white hat SEO techniques - from content creation which will result in holistic link building. Fortunately for us, this has been our practice since the beginning and our clients are enjoying the rewards.”

Oracle Digital is a leading SEO and Internet marketing company in Perth which specialises in numerous services, including SEO, reputation management, online press releases and communication strategies. The company is also known for using only ethical and efficient SEO techniques that comply with search engine standards.


Tuesday, July 24, 2012



Dear Google. Please don’t send out any further link warnings to publishers. Your latest round yesterday, intended to clarify the confusion sparked by the ones sent last week, is likely going to make things worse, not better. No more warnings, not until you get some fundamental clarity in place.
Dear Publishers. Here’s our latest news on the crazy link warnings that have gone out and our best attempt at figuring out whether you should be concerned or not.

How We Got Here

Earlier this year, Google began sending out warnings to some publishers, alerting them that they were involved with “artificial” or “unnatural” linking. Many publishers that received these messages saw ranking drops, especially after Google’s Penguin Update.
Google later said that one way to recover from Penguin was to get bad links removed. It also said that anyone who received a link warning should take action to remove bad links.

Last Week’s Confusion

Last week, Google began sending out a new round of link warnings. These were exactly the same as link warnings that had gone out in prior months, warnings that meant — according to Google — that a site might see a ranking drop if it didn’t act to remove bad links or otherwise report them in some way to Google.
Cue panic.
Cue next the head of Google’s web spam team, Matt Cutts, who said not to panic, because the latest round of messages was different. These messages, Cutts explained, were meant to inform some publishers that there were links pointing at their sites that Google might now “distrust” but not something “you automatically need to worry about.”
Unfortunately, there was no way to tell if you got a link warning that you could safely ignore or not.
Cue confusion.

More Warnings, New Wording

Seeing the confusion, Google made a change over the weekend. Cutts commented on our original story, saying:
An engineer worked over the weekend and starting with the messages that we sent out on Sunday, the messages are now different so that you can tell which type of situation you’re in. We also changed the UI in the webmaster console to remove the yellow caution sign for these newer messages. That reflects the fact that these newer notifications are much more targeted and don’t always require action by the site owner.
Now, if you’re getting what I’d call a “link advisory” from Google, rather than the traditional link warning, it says something like this:
We’ve detected that some of the links pointing to your site are using techniques outside Google’s Webmaster Guidelines. We don’t want to put any trust in links that are artificial or unnatural. We recommend removing any unnatural links to your site. However, we do realize that some links are outside of your control. As a result, for this specific incident we are taking very targeted action on the unnatural links instead of your site as a whole.
If you are able to remove any of the links, please submit a reconsideration request, including the actions that you took. If you have any questions, please visit our Webmaster Help Forum.
I’ll get back to interpreting this message in a moment. The new messages, the ones that you’re apparently safe to ignore (not that the message makes this clear), appear listed under the “All Messages” area when you log into Google Webmaster Central. They may look like this:
[Screenshot: a “Warning” link message with a yellow caution icon, alongside an “Advisory” message with no icon]

In the screenshot above, “Warning” points at an example of the message that went out last week. Notice the yellow warning sign next to it. Apparently, going forward from this past weekend, link warnings you need to act upon will always have this type of warning sign.
Marked as “Advisory” in the screenshot is an example of the advisory messages that began going out on Sunday. There’s no yellow warning sign, which is designed to reassure publishers that they don’t necessarily have to take action. Maybe.

Can You Ignore The New Advisories?

Maybe? Well, we’re trying to get more clarity from Google on the latest messages. Cutts had suggested that these types of messages were more to inform site owners about distrusted links pointing at them rather than require site owners to take any particular action. But the new advisories still have wording that may cause panic.
After all, the new supposedly-reassuring messages tell people that Google recommends “removing any unnatural links” or to submit a “reconsideration request.” Telling publishers to submit reconsideration requests by its very nature suggests they are getting penalized.
What’s a confused publisher to do? I think if you got one of the messages last week, don’t worry unless you also noticed a recent traffic drop from Google.
If you got one of the new advisories, you’re likely safe to ignore them. However — and now your head will really hurt — that might be a precursor to a future ranking drop that has nothing to do with your site being penalized.

When Links Don’t Get Counted

For years, Google has said that it might not count some of the links it finds on the web as “votes” in favor of a particular web site. The Penguin Update seems to have ramped that up. The messages that are going out also seemed to be tied to this, to better alert publishers that votes they thought they were getting might no longer count.
Think of it as an election, where bogus links are like bogus votes. The ballot box has been stuffed, and those bogus links aren’t being caught. Candidates getting those votes get elected. Then the elections people start looking more closely at the votes and tossing out the bogus ones. Now the candidates that were winning no longer get elected, or elected as often.
Those candidates weren’t penalized. They weren’t barred from being in the election. They just weren’t allowed to benefit from bad votes.
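To make the analogy concrete, here is a tiny Python sketch of the difference between discounting and penalizing. The link data and the "distrusted" flag are invented for illustration; the point is simply that a tossed-out vote contributes nothing, rather than subtracting anything.

    # Hypothetical illustration of "discounting" rather than "penalizing" links.
    # The link data and the distrust test are made up for this example.

    links = [
        {"source": "blog-a.example",       "target": "candidate.example", "distrusted": False},
        {"source": "spam-network.example", "target": "candidate.example", "distrusted": True},
        {"source": "spam-network.example", "target": "candidate.example", "distrusted": True},
        {"source": "news-site.example",    "target": "candidate.example", "distrusted": False},
    ]

    def vote_count(target, link_graph, discount_bad_links=True):
        """Count inbound links; distrusted links are ignored, not punished."""
        total = 0
        for link in link_graph:
            if link["target"] != target:
                continue
            if discount_bad_links and link["distrusted"]:
                continue  # the bogus ballot is tossed out, nothing more
            total += 1
        return total

    print(vote_count("candidate.example", links, discount_bad_links=False))  # 4 votes before
    print(vote_count("candidate.example", links))                            # 2 votes after

The site never goes negative; it simply stops benefiting from ballots it should never have had.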

Penalty Or New Way Of Counting Votes?

This is why Google has begun saying that Penguin isn’t a penalty but rather just an algorithmic change, where the algorithm detects what it considers bogus votes and doesn’t count them. If you were a site with a lot of bogus votes, then you’re going to be hit harder than sites that have only a few of them among all the legitimate ones.
That’s also why Google hasn’t advised people to do reconsideration requests, if they were hit by Penguin. There was no manual action that could be removed. In other words, Penguin didn’t ban them from being in the election. It just didn’t count bad votes.
Confusingly, however, Google did advise people to clean up bad links. That suggests that Penguin does more than just discount bad votes. It clearly must somehow penalize sites that seem to have a lot of bad links. Otherwise, there would be no reason to advise removing bad links.

So Why Send Messages?

That leads back to why the latest round of messages may be going out. If Google is automatically counting bad links as a way to penalize sites, rather than just discounting them, then any site is potentially vulnerable to “negative SEO,” where someone might point bad links at a competitor.
Google has continued to discount the threat of negative SEO. SEOmoz just covered how an overt negative SEO attack hasn’t impacted its traffic. But the new advisories might be a poorly implemented way of reassuring publishers that they needn’t worry about negative SEO. They seem designed to give a heads-up that a site might not benefit from some links in the way it did before, not because the site faces a penalty but because the links aren’t counted.
Potentially, that’s helpful. It’s good advice for a publisher to understand they might not be ranking well because the link counting methodology has changed, not because they’ve done something wrong or because of some negative SEO attempt.
However, it’s bad advice to be unclear, and to suggest that publishers should actively try to remove links pointing at their site if those links supposedly aren’t trusted anyway. It’s worse to tell them to file reconsideration requests if they’ve done nothing warranting reconsideration.
If Google wants to keep discounting an increasing number of links out there, that’s Google’s right. But publishers have better things to do than be dragged into that ballot box policing if they’ve done nothing wrong.


Source: http://searchengineland.com

Saturday, July 21, 2012

Patents provide exclusive rights to inventions from the date of filing of the patent application and, in the US, are filed through USPTO.gov. SEO-related patents provide insights into search engines’ algorithms. The exact number of patents Google holds is not published online, but various reports suggest the company owns hundreds, presumably even thousands, of patents with the USPTO.
Today, I would like to share my all-time favorite SEO patents filed by Google, which have given me insights into the algorithm and helped me align our SEO strategies with Google’s guidelines.
______________________________________________________________________________________________________
#5 – AGENT RANK
Inventors: Minogue; David; (Palo Alto, CA) ; Tucker; Paul A.; (Mountain View, CA)
Filed: August 5, 2011
This patent gives insight into how Google might use your online identity as a signal for organizing rankings. Similar to PageRank, it assigns a score to your profile (Google+) and associates your content with digital signatures.
Abstract of Patent
The present invention provides methods and apparatus, including computer program products, implementing techniques for searching and ranking linked information sources. The techniques include receiving multiple content items from a corpus of content items; receiving digital signatures each made by one of multiple agents, each digital signature associating one of the agents with one or more of the content items; and assigning a score to a first agent of the multiple agents, wherein the score is based upon the content items associated with the first agent by the digital signatures.
SEO Takeaways:
1- Your Google+ agent rank can influence your page rankings on Google if you have a high score. The score may be determined by the number of people who follow you and how authoritative and influential they are on the web.
2- This can help avoid duplicate content issues by associating your Google+ profile with your content as the original source of information.
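The patent does not publish a formula, so the following Python sketch is only a guess at the flavor of the idea: an agent's score is invented here as a function of follower count and follower authority, and content signed (via digital signature) by a higher-scoring agent gets a ranking boost. All names and numbers are made up.

    import math

    # Toy sketch of an "agent rank" style score. Everything here is assumed:
    # the patent gives no formula, so this only illustrates the idea that
    # content signed by a more authoritative agent could rank higher.

    agents = {
        "alice": {"followers": 12000, "avg_follower_authority": 0.7},
        "bob":   {"followers": 150,   "avg_follower_authority": 0.3},
    }

    # Digital signatures tie content items to the agents (profiles) that produced them.
    signatures = {
        "original-post": "alice",
        "copied-post":   "bob",
    }

    def agent_score(agent):
        """Invented scoring: audience size (log scale) weighted by audience authority."""
        data = agents[agent]
        return math.log10(1 + data["followers"]) * data["avg_follower_authority"]

    def rank(content_items, base_relevance):
        """Re-rank content: relevance boosted by the signing agent's score."""
        return sorted(content_items,
                      key=lambda item: base_relevance[item] * (1 + agent_score(signatures[item])),
                      reverse=True)

    print(rank(["copied-post", "original-post"],
               {"copied-post": 1.0, "original-post": 1.0}))
    # With equal relevance, the post signed by the higher-scoring agent comes first.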
______________________________________________________________________________________________________
#4 – Inferring Geographic Locations for Entities Appearing in Search Queries
Inventors: Karanjkar; Sushrut; (Fremont, CA) ; Subramanian; Viswanath; (San Jose, CA) ; Thakur; Shashidhar; (Saratoga, CA)
Filed:  December 16, 2011
Ever wondered why a certain website ranks higher in a country-specific Google property even if the site has no relation to that country? On June 21, 2012, Google was granted a patent for logging information in a query log file that tracks clickstream data and ranking sites based on clicks coming from specific locations.
What this means for SEO is simple: if a high number of people in a specific country click through to your website for a specific term, it will have a big influence on your ranking in that country.
Abstract Of Patent
A server system associates one or more locations with a query by identifying the query, selecting a set of documents responsive to the query, and assigning weights to respective documents in the set of documents based, at least in part, on historical data of user clicks selecting search result links in search results produced for historical queries substantially the same as the identified query. Websites hosting the selected documents are identified, and, for each website, location-specific information for one or more locations is retrieved, including a location-specific score that corresponds to the likelihood that the respective location corresponds to a respective website. For each respective location for which location-specific information was retrieved, aggregating the location-specific scores, as weighted by the document weights, to compute an aggregated likelihood that the respective location is associated with the query. A specific location is assigned to the query when predefined criteria are satisfied.
SEO TakeAways:
1- Traffic data can actually affect your ranking and organic visibility in that country’s search results. Google wants to get very geo, and this patent is proof. Get local: build links on local entities and talk about local topics in your content.
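The abstract reads almost like pseudocode, so here is a rough Python rendering of the aggregation step it describes. The click shares, site-to-location scores and threshold below are all invented; the shape of the computation is what matters.

    # Rough, assumed rendering of the aggregation described in the abstract.
    # Click shares, site-location scores and the threshold are all invented.

    # Historical click share of each document returned for the query "plumber".
    doc_click_weight = {"site-a.example/page1": 0.6, "site-b.example/page2": 0.4}

    # Likelihood that each hosting website corresponds to a location.
    site_location_score = {
        "site-a.example": {"Perth": 0.9, "Sydney": 0.1},
        "site-b.example": {"Perth": 0.5, "Melbourne": 0.5},
    }

    def infer_query_locations(doc_weights, location_scores, threshold=0.5):
        """Aggregate location scores, weighted by document click weights."""
        totals = {}
        for doc, weight in doc_weights.items():
            site = doc.split("/")[0]
            for location, score in location_scores[site].items():
                totals[location] = totals.get(location, 0.0) + weight * score
        # Assign only locations whose aggregated likelihood clears the threshold.
        return {loc: round(s, 2) for loc, s in totals.items() if s >= threshold}

    print(infer_query_locations(doc_click_weight, site_location_score))
    # -> {'Perth': 0.74}  i.e. 0.6*0.9 + 0.4*0.5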
______________________________________________________________________________________________________
#3 – Ranking documents based on user behavior and/or feature data
Inventors: Dean; Jeffrey A. (Palo Alto, CA), Anderson; Corin (Mountain View, CA), Battle; Alexis (Redwood City, CA)
Filed: June 17, 2004
This patent specifies that not every link on a page is of equal value. Aside from anchor text, Google might look at other features associated with a link and determine how much value to give it based on those features.
Some of the features described in the patent include: a) the font size of the anchor text; b) the position of the link; c) whether the link is in a list; d) the number of words in the anchor text; e) how commercial the anchor text associated with the link might be; f) whether the target URL is on the same domain; and many more.
Abstract Of Patent
A system generates a model based on feature data relating to different features of a link from a linking document to a linked document and user behavior data relating to navigational actions associated with the link. The system also assigns a rank to a document based on the model.
SEO Takeaways:
1- Make sure the links on your pages carry the attributes the patent lays out.
2- Links shown toward the top of the page may carry more link value.
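Here is a rough Python sketch of what a feature-weighted link value might look like. The features echo the patent's list above, but the weights and the formula are entirely assumed and purely illustrative.

    # Assumed illustration: each link carries a value derived from its features.
    # The features echo the patent's examples; the weights are invented.

    def link_value(font_size_px, position_from_top, in_list, anchor_word_count,
                   commercial_anchor, same_domain):
        """Score a link from its on-page features (toy weighting)."""
        value = 1.0
        value *= min(font_size_px / 16.0, 1.5)         # larger anchors help, up to a cap
        value *= 1.0 / (1.0 + position_from_top / 10)  # links near the top count more
        if in_list:
            value *= 0.8                               # boilerplate link lists count less
        if 1 <= anchor_word_count <= 5:
            value *= 1.2                               # descriptive but not stuffed anchors
        if commercial_anchor:
            value *= 0.7                               # overtly commercial anchor text
        if same_domain:
            value *= 0.5                               # internal links count less
        return round(value, 3)

    # A prominent editorial link vs. a stuffed footer link on the same domain.
    print(link_value(18, 1, False, 3, False, False))   # higher value
    print(link_value(12, 40, True, 8, True, True))     # much lower value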
______________________________________________________________________________________________________
#2 - Presenting Social Search Results
Inventors: Callari; Francesco G.; (San Francisco, CA) ; Kulick; Matthew E.; (San Francisco, CA)
Filed: December 1, 2010
Google might start showing you results from members of a specific group you are associated with, such as employees of your company or members of groups like social clubs or fan clubs. The patent also covers real-time results, how they might be associated with Google+, and how certain profiles would be more likely to show up in real-time search results based on several factors.
Abstract Of Patent
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for detecting malicious system calls. In one aspect, a method includes identifying members of a social affinity group of the user, the social affinity group having members having a relationship to the user. The method includes receiving search results including search results that reference resources associated with members of the social affinity group. The method includes identifying a first search result that references the social network site. The method includes identifying a second search result that references a resource found on the social network site and associated with a member of the social affinity group. The method includes generating a response to the query comprising instructions that the first search result be presented in proximity to the second search result.
SEO Takeaway:
1- Google is giving a reputation score to profiles and using that score to influence search rankings.
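As a minimal Python sketch of the presentation step the abstract describes (with invented result data): when a result from the social network site and a result associated with a member of the searcher's affinity group both appear, they are placed in proximity.

    # Assumed sketch of "presenting social search results": results tied to a member
    # of the searcher's social affinity group are presented next to the result for
    # the social network site itself. All data below is invented.

    results = [
        {"url": "news.example/article"},
        {"url": "socialnetwork.example", "is_social_site": True},
        {"url": "review.example/post"},
        {"url": "socialnetwork.example/jane/post", "social_member": "jane"},
    ]

    affinity_group = {"jane", "raj"}  # e.g. co-workers or members of the same club

    def present(results, affinity_group):
        """Reorder so affinity-group results sit directly after the social-site result."""
        social_site = [r for r in results if r.get("is_social_site")]
        affinity = [r for r in results if r.get("social_member") in affinity_group]
        rest = [r for r in results if r not in social_site and r not in affinity]
        return social_site + affinity + rest

    for result in present(results, affinity_group):
        print(result["url"])
    # socialnetwork.example and jane's post now appear next to each other.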
______________________________________________________________________________________________________
#1 - Context Sensitive Ranking
Inventors: Garg; Ashutosh (Sunnyvale, CA), Dhamdhere; Kedar (Sunnyvale, CA)
Filed: March 9, 2009
For many years there has been speculation on the web that Google might use clickstream data to organize search result pages. This new patent, granted to Google on June 26, 2012, describes how Google might re-rank search results based on the context of what you click the most.
The patent describes how your clickstream data might be used to show you different results later in a particular session, re-ranking search results based on your click history. The new results are biased by the statistics associated with the results you clicked.
Abstract Of Patent
Methods, systems, and apparatus, including computer program products, in which context can be used to rank search results. Context associated with a user session can be identified. A search query received during the user session can be used to identify a contextual click model based upon the context associated with the user session.
SEO Takeaways:
1- Create your content based on the topic modeling concept, so your content surfaces for the topics users are most likely interested in. The most common model, and arguably the most useful for search engines, is Latent Dirichlet Allocation, or LDA, which mainly consists of clustering words into “topics.”
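To close, here is a small Python sketch, with invented click statistics, of the contextual re-ranking idea in the patent: results that were clicked earlier in the same session context get a boost the next time a related query comes in. (The LDA topic modeling mentioned above is a separate, heavier tool and isn't shown here.)

    # Assumed sketch of a "contextual click model": click statistics gathered in a
    # session bias how later results in that session are ordered. All data is invented.

    session_clicks = {"python tutorial": {"docs.python.example": 3, "blog.example": 1}}

    def rerank(query, results, session_clicks, weight=0.5):
        """Boost results the user has already clicked in this session's context."""
        clicks = session_clicks.get(query, {})
        total = sum(clicks.values()) or 1
        def score(result):
            click_share = clicks.get(result["site"], 0) / total
            return result["base_score"] * (1 + weight * click_share)
        return sorted(results, key=score, reverse=True)

    results = [
        {"site": "blog.example",        "base_score": 0.82},
        {"site": "docs.python.example", "base_score": 0.80},
        {"site": "forum.example",       "base_score": 0.78},
    ]

    for r in rerank("python tutorial", results, session_clicks):
        print(r["site"])
    # docs.python.example edges ahead of blog.example because of its click history.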


Source: http://www.seoreseller.com/

Friday, July 6, 2012

Google wants to know how its employees can become more effective.
 ("Join the club," managers at every company in the world say in response.)
GoogleEDU, the company's two-year-old learning and leadership development program, is getting a boost in attention as the company takes its data-driven ways to the classroom.
Can you formalize learning for more consistent return on investment? That's the question Google is working to address as it enrolls some 10,000 employees in classes aligned with the company's overall strategy, dropping and adding classes for effectiveness almost as quickly as it swaps advertisements on its popular services.
Google's not alone in the endeavor -- after all, every major corporation has an education offering in place with the hope of keeping its employees' skills relevant. (ZDNet publisher CBS Interactive has one, too.)
The difference is how they're doing it in Mountain View. 
The Wall Street Journal's Joseph Walker explains:
Google thinks it has found a way to make its learning stick. It has become more exacting about when it offers classes and to whom. It uses employee reviews of managers—similar to the instructor reviews that college students fill out at the end of a semester—to suggest courses to managers. Ever data-obsessed, Google uses statistics gathered from current and former employees to recommend certain courses to managers at different points in their career, say after a move to a new city or joining a new team.
That matters more than ever when there are 8,000 new heads on your payroll, per the company's latest recruitment drive, particularly when they come from companies that aren't as progressive in terms of company culture and hierarchy.
(Think about it: how do you motivate reports who are demonstrably smarter than you? Fear won't do it.)
Some of the classes focus on exerting influence in a company that prioritizes ideas over titles. Others focus on areas of expertise. The end goal: target training like crazy, so that it can more easily be tied to actionable goals.
As we all know, Google's Panda and Penguin algorithm updates have thrown webmasters and search engine consultants for a loop, making the tried and tested methods they use to bump sites up the rankings ineffective, if not downright obsolete.
Of course, these updates are just the latest volley in the constant battle between Google, which tries to provide users with relevant search results, and those who seek to get their sites to the top by any means necessary.
So in the spirit of mischief, let's check out the most notorious ways in which people have exploited Google's algorithm.

10. George Bush Is A Miserable Failure

Depending on your politics, you may or may not agree with the sentiment, but a Google search for the term "miserable failure" would have returned George Bush's Whitehouse.gov page as the first result for a significant portion of the early 2000s.
The prank was accomplished by Google bombing, a mass action by various webmasters who in this case linked to the Whitehouse.gov page via the phrase "miserable failure" on their sites. When the prank was started in 2003 by the Old Fashioned Patriot blog, Google made a statement that they had no problem with the practice; however, come 2007 they decided to tweak their algorithm to minimize the effect of Google bombs. Why? Because people thought that these results were Google's opinion!
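As a toy illustration of why the bomb worked, here is a short Python sketch with entirely invented data: if ranking for a phrase leans heavily on the anchor text of inbound links, a coordinated group of sites can push an unrelated page to the top for that phrase.

    # Toy, assumed model of anchor-text influence: a page's score for a query is
    # simply the number of inbound links whose anchor text matches the query.

    inbound_links = [
        {"anchor": "miserable failure", "target": "whitehouse.example/president"},
        {"anchor": "miserable failure", "target": "whitehouse.example/president"},
        {"anchor": "miserable failure", "target": "whitehouse.example/president"},
        {"anchor": "miserable failure", "target": "satire-blog.example/failure"},
    ]

    def score_for_query(query, page, links):
        """Count inbound links whose anchor text matches the query (toy metric)."""
        return sum(1 for link in links
                   if link["target"] == page and link["anchor"] == query)

    pages = {"whitehouse.example/president", "satire-blog.example/failure"}
    ranking = sorted(pages,
                     key=lambda p: score_for_query("miserable failure", p, inbound_links),
                     reverse=True)
    print(ranking)  # the bombed page ranks first for a phrase it never mentions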

9. Quixtar Pushes Positivity

Google bombs are not just used for pranks and political commentary.
In 2004, reports emerged from a Quixtar talk alleging that a senior Quixtar (now Amway Global) representative had discussed the use of hired geekoids to post positive stories about them, so that negative websites would fall off the rankings.
The multi-level marketing company released a statement in which they maintained that they never knowingly violated search engine rules. However, online observers noticed that, soon after, Quixtar's official site dropped to the third page of Google results.
As of May 2012, a search for Quixtar offers, in the top six, a result about how "Quixtar sucks" and a link to an NBC Dateline investigation into the company, so it seems that their efforts didn't completely pay off.

8. GoDaddy Gets Punished

Back in December 2011, vast numbers of Internet users were up in arms about the Stop Online Piracy Act, or SOPA, so GoDaddy's support for the bill was not looked upon with pleasure.
Happily sitting at number one for the Google query "domain registration", GoDaddy was hit with a Google bomb whereby its ranking was targeted for replacement by an anti-SOPA domain registrar, NameCheap.
Like the "miserable failure" bomb, this was again caused by webmasters linking, in this case to NameCheap, through the phrase "domain registration".
Disseminated via social media and Hacker News, the campaign (along with a threatened boycott) caused GoDaddy to go back on its support for the bill.
As of early May 2012, GoDaddy is ranked at number two for the search, right behind NameCheap, which just goes to show it doesn't pay to have unpopular opinions where the web's concerned.

7. DecorMyEyes Makes Lemonade Out Of Lemons

Most business owners go out of their way to be nice to customers in order to avoid negative reviews. However, Vitaly Borker, the owner of eyewear store DecorMyEyes, noticed something interesting: the more reviews he got, the higher his search rankings, regardless of whether the reviews slammed his business or not.
In fact, he began to deliberately court bad publicity and upset customers in order to get even more negative reviews! It sounds crazy, as surely seeing these negative reviews would cause a searcher to move on, right? The thing is, though, searching for individual products that his site sold would still bring his site up prominently, as it had a high PageRank from all the angry, link-filled reviews, but without the reviews themselves!
After a New York Times exposé on the site, Google acted swiftly to institute an algorithm that tackled bad merchants. Borker pleaded guilty to threatening customers and fraud, and in May 2012 was still awaiting his sentence.

6. Dan Thies Gets Google Bowled

Google bowling is one of the blackest tricks in the unethical, or black hat, search engine optimization toolkit.
It's a pretty simple idea. Google punishes websites that try to increase their ranking by breaking Google's rules; for example, by using splogs (fake blogs intended only to link to the target website) or automated commenting on blogs that pushes links.
It stands to reason, then, that setting up these tools for a competitor's site can cause them to get a punishment they don't deserve, and make them fall down the rankings.
When SEO consultant Dan Thies tweeted his pleasure that Google was tackling splogs, he became the subject of an experiment to prove that those being punished were not necessarily to blame. Just 10 days after he posted his tweet, his website had been flagged by Google for violations.

5. The Chocomize Story

In July 2010, the word "chocomize" began to trend on Google, meaning, of course, that the number of people searching for it suddenly increased.
Chocomize itself is an inoffensive website where you can make your own customized chocolate bars. However, its trending status was a little stranger.
A CNN article was published about the company, leading to an initial spike in searches. Bloggers and website owners then discovered it was trending and took advantage of the opportunity for ad revenue by writing useless articles that would draw in traffic and earn them sweet AdWords money.
All this activity led to a higher spike, causing the actual site and the original article on the company to slide down the rankings, in turn leading to useless results for searchers.

4. J.C. Penney Pays For Links

When you think of unethical web practices, you probably think of sleazy hackers or small companies trying to save a buck, not corporate giants like J.C. Penney.
However, in 2011 the department store was found to have been paying for links to its website, and Google gave it 90 days to sit in the corner and think about what it had done.
J.C. Penney denied knowledge of the scheme, blaming it on the SEO company it had hired, but the scandal brought the illegitimate technique into the limelight and showed that nobody is above Google's wrath.

3. Forbes Sells Links

Of course, for sites to buy links, somebody has to be selling them, and it turns out that little blogs and websites aren't the only culprits.
Along with J.C. Penney, Google called out Forbes.com for selling text links that let other sites raise their PageRank. Even more shocking, 2011 wasn't the first time Forbes had been in trouble with Google. It was penalized back in 2007 for the same offense. And it looks like they didn't learn their lesson, even going cap in hand to the Google webmaster help forum asking for clarification on which links were a problem.

2. Malware Targets High Rankings

For many people, Google is seen as the last bastion of objective, truthful information. Yet this trusting approach makes Google search results a tempting target for malware promoters.
A study of top search terms has shown that almost half of the top 20 search terms are constantly being targeted by malicious websites that try to get visitors to download malware such as fake antivirus software.
While Google strives to remove these sites as soon as possible, the numbers are staggering. Over just six days in March, for example, almost 300 top Google searches were targeted by around 6,500 malicious websites, and it's a continuing battle to take these sites down as soon as they pop up.

1. BuildMyRank

BuildMyRank and other link-building networks were a popular tool for webmasters and search engine consultants. However, as of March 19, 2012, BuildMyRank is no more, with Google de-indexing almost the entire network.
Why? Because the links that the network provided, while great for website owners looking to rise up the ranks, gave no actual information about how relevant or useful the site was.
Links can be helpful for search results when they are genuine and used because another site is considered relevant. However, these networks existed purely to abuse the system. BuildMyRank and similar sites were notorious as a cheap and easy way to rise up the rankings; it's even in the name! Perhaps the only surprising thing is that it took Google so long to crack down on them.

Source: www.searchenginepeople.com

Tuesday, July 3, 2012

SEO Company Oracle Digital releases improved products and services to help strengthen the SEO campaigns of clients and customers after the release of Google Panda Update 3.8.

Perth, Western Australia (PRWEB) July 03, 2012

Oracle Digital, an established Perth digital marketing company, has strengthened its set of products and services in order to help clients keep abreast with the changes taking place over the online world.
This modification of services from the company was in response to rumours from observers and experts that Google had recently refreshed its Panda algorithm – a change which may immediately affect the sites of numerous online businesses. The previous Google update was released on June 8; then, on June 25, Google confirmed that it was releasing Google Panda 3.8.

The latest refresh baffled many, considering that Panda changes had previously taken place only about once a month – unlike Panda 3.8, which was already the second refresh of the month. But Google explained that this latest version is not, strictly speaking, an algorithm change; its purpose was mainly to catch websites that needed to adjust to Panda and to release those that had already complied.

With this statement, analysts believe there is a big chance that website owners will see fluctuations and changes in their Analytics reports and in their SERPs – which is why they suggest making the necessary changes and adjustments to avoid any untoward consequences.

According to established SEO professionals, there are numerous ways one can use the new algorithm to one's advantage. Google Panda was set out to be a tool to ensure quality of content, serving as a filter to ferret out low-quality content, links and other ineffective methods.

In addition, experts suggest that site owners remove low-quality site content and start creating content that is unique and original, steering clear of duplicating material that is already present all over the Internet. Keyword spamming and excessive ads are also considered strategies that may result in a site or post being flagged as low quality.

In effect, industry analysts and experts believe that every site must aim to be an authority in order to become more successful in the niche or field it caters to. But it must use the proper strategies and procedures in order to build that authority.

James Corby, Business Development Director of Oracle Digital, shares his expert opinion regarding the matter. He said, “There is really no need to be afraid of any of the Google algorithm changes taking place, as long as you are using holistic techniques in your online efforts. You just have to remember that all that Google wants is quality and not necessarily quantity - if you are prepared to roll up your sleeves and contribute regularly to your industry online, then success will follow quickly after. Gone are the days of dodgy, surface level tactics to deliver online marketing success.”


Source: http://www.sfgate.com/