Wednesday, 27 April 2011

Taking Apart the Meta Robots Tag

March 17th, 2011 by John

A lot of site owners avoid the Meta robots tag and their robots.txt files, simply because they don't understand either of them. As a part of code, both can be shrouded in mystery to the average non-tech business owner. They really aren't all that much of a mystery, though, and they can be used to assist your site's search engine optimisation.

First things first: what is the Meta robots tag? The Meta robots tag is a part of your code that communicates with the search engines. Just as it says on the box, it is an HTML Meta tag that communicates with robots. Robots don't always do what it says, notably malware robots that ignore it to get on with their malicious deeds, but it's an important part of your site's communication with the non-human forces on the net. The robots.txt file, on the other hand, is a file in your root directory. It does pretty much the same thing, and well-behaved robots check it before they crawl your site.
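To make that concrete, here is a minimal sketch of both mechanisms. The directive values and the /private/ folder are purely illustrative assumptions, not recommendations for any particular site:

    <!-- Meta robots tag: goes in the <head> of an individual page -->
    <meta name="robots" content="noindex, follow">

    # robots.txt: a plain text file at the root of the site, e.g. www.example.com/robots.txt
    User-agent: *
    Disallow: /private/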

Optimising all areas of your site is crucial for good SEO

Issue 1: Meta robots tag vs. robots.txt

Both of these communicate with robots and do broadly similar jobs, but the differences are significant. Some webmasters recommend the Meta robots tag because you don't have to access the root directory to alter it, which is helpful if you don't have that access. On the other hand, many recommend using robots.txt because it is more reliable, and prevents crawling by search engines as well as preventing indexing. If you're using one, you don't need to use the other.

Issue 2: Why would you need to use either the Meta robots tag or robots.txt?

A lot of sites choose to leave these alone, but both the Meta robots tag and robots.txt can be used to sculpt the search engines' communication with your site. For example, if you have pages you don't want Google to bother about, then a simple instruction not to index will mean the page is left alone. If you do this in your robots.txt file, the search engine spiders will leave it alone permanently, whereas if this command is in your Meta robots tag they will come back to check whether you've changed your mind.

Other things that can be handy in terms of search engine optimisation include the use of the nofollow tag, the instruction not to archive, which prevents a cached version appearing in the SERPs, and the instruction not to take snippets. When you have an instruction specific to Google in your Meta robots tag, you can address it to Googlebot.
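As a hedged illustration, the directives mentioned above map onto standard meta robots values like these; using "googlebot" as the tag name addresses the instruction to Google's crawler only:

    <!-- Applies to all compliant robots -->
    <meta name="robots" content="noarchive, nosnippet">

    <!-- Applies only to Googlebot -->
    <meta name="googlebot" content="noindex">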

Issue 3: Seriously? Using 'nofollow' in my own pages?

Using nofollows can help you to sculpt the search engines' experience of your site, for example when you want more attention paid to a particular internal link. The Meta robots tag and your robots.txt file can help your site, and they can hinder it. If you're at all uncertain about what effect the changes you're making to your code are going to have, it's best to consult your SEO company.
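For completeness, a link-level nofollow looks like the sketch below; the login page is just a hypothetical example of a link you might not want to pass attention to:

    <a href="/login" rel="nofollow">Customer login</a>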


Source: http://www.searchengineoptimization.co.uk/seo-blog/on-page-seo/taking-apart-the-meta-robots-tag.html


Tips for Enhancing your Content Writing

April 19th, 2011 by admin

It takes time to develop the skill of writing balanced content for your website. Once you have mastered the art of producing material that appeals to both readers and search engines, there are further enhancements you can use to make it even more effective:

Internal link building

Outstanding content is essential for successful SEO, but internal link building is often tackled poorly. Internal links are vital for a successful SEO campaign: done effectively, they support positive page rankings and make pages easier to navigate, and tools such as SEO Smart Links can set up good-quality internal links automatically.
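As a small sketch of the idea, an internal link with descriptive anchor text might look like this; the URL and anchor text are invented for illustration:

    <a href="/services/link-building/">our link building services</a>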

Title tags, URLs and meta descriptions

Page titles are often overlooked, and yet these small details can have a big impact on SEO rankings. Using long-winded titles can be a mistake and cause issues with Google, so keep them short. The same applies to URLs: use five words maximum. Also check your meta descriptions.
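As a rough sketch of these points, a concise title, meta description and short URL might look like the following; the wording and domain are invented for illustration:

    <title>Content Writing Tips | Example SEO</title>
    <meta name="description" content="Practical tips for writing website content that appeals to both readers and search engines.">
    <!-- URL kept to five words or fewer, e.g. www.example.com/content-writing-tips/ -->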

Keyword research

Once the topic for an article has been found, look at the keywords for that area. Use free keyword research tools to keep your budget fluid. Be aware that the data is sometimes inconsistent, but you'll still gain some excellent insights. Use tools to highlight key phrase variations, which can spark ideas for further articles. Every new keyword and phrase can open up new topics to write about in your blog or information articles.

At www.searchengineoptimization.co.uk we offer a range of SEO services including link building and content management.


Source: http://www.searchengineoptimization.co.uk/seo-blog/seo-content/tips-for-enhancing-your-content-writing.html


Monday, 18 April 2011

5 Site Optimization Blunders that Will Kill Your Rankings

As I have worked with numerous websites throughout my career as a professional SEO, I have seen many different problems that prevent websites from ranking well in the search engines. The following are the five biggest mistakes that I have seen made over and over again.

Blocking the Search Engine Crawlers

The robots.txt file contains instructions for search engine crawlers telling them where they can and can't go on a specific website. Crawlers look for this file when they first hit the website, and you can prevent them from visiting certain pages or folders by using the Robots Exclusion Protocol in that file. The error occurs when the webmaster accidentally blocks the root folder of the website in this file and prevents the crawlers from crawling the entire website. If your robots.txt file has a line which looks like Disallow: / then you are blocking your entire website.
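To illustrate the difference, here is a sketch of a robots.txt that accidentally blocks everything versus one that blocks only a single folder; the /admin/ folder is a hypothetical example:

    # Blocks the entire website from compliant crawlers
    User-agent: *
    Disallow: /

    # Blocks only the /admin/ folder and leaves the rest crawlable
    User-agent: *
    Disallow: /admin/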

JavaScript Navigation

Many sites use JavaScript to create drop-down, accordion and other styles of navigation. These types of navigation can help make it very easy for visitors to navigate large websites. However, for search engine crawlers it can look much different. The problem with JavaScript is that while there is a fully functional menu for website visitors, there are no links in the actual source code. Search engine crawlers rely on links in the code to navigate the website. Disable JavaScript in your web browser and then look at your website. If you can't see the site navigation then crawlers won't see it either.
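One common safeguard, sketched below under the assumption of a simple site structure, is to render the navigation as ordinary HTML links and let JavaScript enhance them afterwards, so crawlers still find anchor tags in the source code:

    <nav>
      <ul>
        <li><a href="/products/">Products</a></li>
        <li><a href="/about/">About us</a></li>
        <li><a href="/contact/">Contact</a></li>
      </ul>
    </nav>
    <!-- A script can still turn this list into a drop-down menu for visitors -->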

Too Much Flash

Flash can take a plain website and make it into an extraordinary one. Complete sites created in Flash can contain videos, images, animations and other features to result in a fantastic user experience. The downside to it all is the type of crawler experience it gives. Because all the elements of a website created in Flash are contained in the video file, those elements are not visible to search engine crawlers. Content and links in Flash do not exist to search engine crawlers. If your website is built in Flash, consider moving some elements outside of the Flash video. Include content on your website and some type of HTML navigation to help the crawlers navigate your website and know what each of the pages is about.
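One hedged way to follow that advice is to keep crawlable HTML content alongside the Flash embed, for example as fallback content inside the object element; the file name and text here are invented:

    <object data="site.swf" type="application/x-shockwave-flash" width="800" height="600">
      <!-- Fallback HTML that crawlers and users without Flash can read -->
      <h1>Example Widgets</h1>
      <p>Hand-made widgets, shipped worldwide.</p>
      <a href="/products/">Browse our products</a>
    </object>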

301 Redirects and Canonical Tags

301 redirects are used to tell crawlers that a page has been permanently moved to a new one. Canonical tags are used to tell crawlers that out of a series of similar pages, one specific page should be included in the search results. Canonical tags should not be used as 301 redirects. When not used properly, canonical tags can prevent proper indexing of a website. If you are using canonical tags you should evaluate what they are being used for. Does your website have multiple categories generated dynamically by the same script, which are all similar to each other with the only difference being the products that are displayed? Do you have pages on your website which can be displayed with several different URLs? Both of these situations would be ideal for using canonical tags.
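To keep the two mechanisms straight, here is a minimal sketch of each; the URLs are invented, and the redirect line assumes an Apache server configured through an .htaccess file:

    <!-- Canonical tag: placed in the <head> of the duplicate or variant page -->
    <link rel="canonical" href="http://www.example.com/widgets/">

    # 301 redirect in .htaccess (Apache): the old page has permanently moved
    Redirect 301 /old-widgets/ http://www.example.com/widgets/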

Duplicate Content or No Content

Good quality content is critical to ranking well. Google has stressed the importance of having lots of good quality content on a website and their focus on content was made even more apparent with the Farmer / Panda Update. Crawlers rely on content to determine what a page should rank for in the search results. Because Google wants to give the best user experience, they are making a big push to show only websites with high quality content. Duplicate content, shallow or poorly written content was the focus of the Farmer / Panda Update. Sites that had problems with content quality saw a drop in rankings when the update was pushed live last month. Look at the content on your website. Do you have enough of it? Is it original or was it copied from another website? Expand on your content where the webpage has very little and write your own content for pages where it is copied from another source.



Source: http://www.seo.com/blog/5-site-optimization-blunders-kill-ranking/


Sunday, 17 April 2011

Google Analytics will be upgraded to its new version soon

Web analytics is a fantastic way to see complete statistics about your visitors: where they come from, which keywords they land on, and so on. It not only lets you view visitor statistics for a website but also helps webmasters see where they are falling short and how to improve visitor numbers. Many web analytics tools are available on the market today; one of them is Google Analytics, developed by search giant Google. Google Analytics provides information about visitors' countries and territories, referrals, visitors from search engines, specific content and page titles, and more.

(Screenshot: Google Analytics new version)

Today I noticed a link to the new version of the Google Analytics tool in the top right of the site. Compared to the old version, the new Google Analytics is well designed, looks glossy and seems very user friendly. The new design comprises three parts: (i) Dashboard, (ii) My Site and (iii) Custom Reports. The My Site tab contains the Reports and Intelligence sections. The Reports section shows complete visitor statistics, and Intelligence is a new feature that provides the webmaster with alerts about visitors on a daily, weekly and monthly basis.

(Screenshot: Google Analytics new version features)



Source: http://www.cogzidel.com/blog/2011/04/google-analytics-will-be-upgraded-to-its-new-version-soon/


Saturday, 16 April 2011

What to Expect from Google Algorithm Changes

Google has received a lot of flak about search spam lately from both search experts and everyday users. Whether you've noticed or not, you probably see some form of search spam every time you enter a query into Google, Bing or any other search engine.

What is Search Spam?

Search spam consists of high-ranking query results that seem to match what you were looking for yet offer no real information. Typical methods of achieving this involve keyword stuffing and manipulating relevance in order to trick search engines like Google and Bing into indexing a page as relevant.

Examples of Search Spam

Last week, Matt Cutts, the head of Google's Webspam team, confirmed changes made to Google's algorithm that will specifically target content scraper sites and content farms. The former are low-quality sites without original content and the latter are websites designed purely to answer search queries, such as eHow.com and about.com.

Blekko Takes the Spotlight

Time will tell if the algorithm changes make a difference in the prevalence of spam in Google's search results. An up-and-coming search engine called Blekko isn't waiting around to find out. In light of the recent publicity surrounding Google's webspam problem, Blekko has announced that they have banned the top 20 content farms from their index. This includes well-known websites such as eHow.com, encyclopedia.com and thefreedictionary.com.

Overall Impact

The end result of search engines' efforts to curb search spam is more relevant results for the user. It will also give smaller content-driven sites an opportunity to rank higher in search results that have previously been dominated by spam.


Source: http://www.submitawebsite.com/blog/2011/02/what-to-expect-from-google-algorithm-changes.html


Link Your Twitter and Facebook Accounts

Bring your social media universes together by linking your Facebook and Twitter accounts. Once you're logged into Facebook, look up Twitter in the search field in the header. The Twitter application should be the first populated suggestion; click on it.

You'll have to grant Twitter access to your Facebook account, and allow Twitter to post tweets to your Facebook profile. This not only bypasses the nuisance of having to repost content to both social media outlets, but also lets you find and follow your Facebook friends who are also using Twitter.

Also check out Twitter for Pages for similar functionality tailored to your business's Facebook Page.

Source: http://www.grovo.com/blog/2011/04/link-your-twitter-and-facebook-accounts/


Thursday, 14 April 2011

Columbus, OH: Google Analytics Seminars for Success

We'll be taking our Google Analytics Training (Seminars for Success) to Columbus, OH, on April 11-13, 2011. It's three days long:


  1. Day 1: Google Analytics 101 for marketers and analysts just getting started with GA
  2. Day 2: GA 201 for intermediate and advanced marketers/analysts
  3. Day 3: GA 301 for techies

Sometimes people ask me if they should go to the GA 101 or the GA 201. So many people choose to go to both (without ever asking) that I usually reference the eval forms of those people (the ones who go to both). They usually say that the 101 was a little easy for them, but that they needed the refresher and picked up some valuable things, so I hope that helps those of you who are deciding.

The day starts at 8:00 with continental breakfast, and classes begin at 8:30. In addition to a couple of breaks throughout the day, we also have a break for lunch (which we serve). There are handouts and jump drives with the data, and Wi-Fi for all (so we always encourage people to bring their laptops).

You can learn more about the courses here, or register for the training here.

Robbin

Related Posts

  1. Last Call: Google Analytics Training in Boston
  2. Google Analytics Training in NYC: June 8, 9 and/or 10, 2010
  3. Google Analytics Training (in NYC, June 2, 2009)
  4. New Google Analytics Training: October 3 in Washington, DC
  5. Google Analytics Training: May 12, 2009 in DC


Source: http://feedproxy.google.com/~r/lunametrics-blog/~3/Cz5qwGvwksw/


Twitter's Buyout Denials

February 15th, 2011 by John

If you've kept your ear to the ground over the past week or so, then you'll know that rumours of either Google or Facebook buying Twitter have reached fever pitch. But while reports are emerging saying initial negotiations have been held and that waters are being tested, Twitter chief executive, Dick Costolo, has recently said that that's all they are at the moment: rumours.

Mr. Costolo was speaking in Barcelona at the Mobile World Congress. He said, "People write that stuff all the time... I don't know where these things come from, it's just a rumour." What's telling is that there were no denials or aggressive dismissals. With Twitter thought to be making losses because of staff additions and the purchase of new data centres, Dick also suggested that the microblogging kings are looking at new ways of generating revenue, though there were no indications of what those methods may be.

Twitter are keeping their cards close to their chest

Also, to provide an insight into just why Twitter is being valued at upwards of $10 billion, Dick also highlighted statistics collected during this year's Superbowl. "At the end of this year's Superbowl we saw 4,000 tweets per second. During the game there were sustained periods when it was 3,000 tweets per second. Just to give you some contrast, during the 2008 Superbowl we served 28 tweets per second."

We can't help but wonder what the implications would be if Google purchased Twitter and what it'd mean for SEO. It'd be a large social statement of intent from the search giants, and show they're serious about moving into such a lucrative field. Facebook and Twitter together, though? What a social hub that would be!

From what Dick says, though, it doesn't look to be happening anytime soon. But we've heard coy business denials before. Can Twitter really stand on its own two feet from a financial point of view? The next few months will be very telling indeed...


Source: http://www.searchengineoptimization.co.uk/seo-blog/social-media-optimization/twitters-buyout-denials.html


(Digital) Content is King

With today's social-media frenzy, digital content is more important than ever. Whether on a website, a blog, a Facebook update, a microsite, an online ad or a Tweet, digital content is king.

Why go digital? Because online content has the power to spread like wildfire. It's one of the reasons that press releases have experienced a renaissance on the Internet. If your press release, YouTube video or Facebook update goes viral, you could see an influx of hundreds or thousands or millions of visitors.

The best part? Online content is one of the only forms of media that can take on different forms. Take Burberry's The Art of the Trench microsite, for example. It claims to be a living celebration of the classic Burberry trench coat and the people who wear it, but is it an advertisement or social media? Is it marketing or entertainment? It's almost chameleon-like in its ability to address the needs of the company and the consumer.

Social media is blurring the line between what's advertising and what's "just for fun." And customers are spreading the word about companies' products and services with the company doing little (or none) of the work. When customers do the work for you and share your story with others, the results can be phenomenal. But if you're going to do it, it's important to keep up with your digital content so visitors don't see tweets or blog entries that are a couple of months old. Also, search engines will show recent tweets and status updates, so some traffic can be generated by making a quick daily update part of your routine.

Another benefit of digital content is that you can get immediate feedback from your client base. Customers who are connected 24/7 will respond instantly to Facebook status updates, tweets, comments on your blog posts (especially if they have an RSS feed). And these same customers will share these updates with their friends, and their friends will share them with their friends and so on.

You get the point. Now get out there and post some digital content, stat!


Source: http://www.submitawebsite.com/blog/2011/01/digital-content-is-king.html


Wednesday, 13 April 2011

I Hate When You Tweet about Yourself (and pretend not to)

Lots of digital ink has been spilled over Twitter sins. Tweeting what you ate for breakfast today, etc. But that doesn't compare to these three sins:

Making social plans in plain view of all your followers. Yes, I know, you want to show off that you are friends with important people. Yes, some will look up to you. But ultimately, it is rude. Or as my mother used to say decades ago, if you aren't going to invite everyone in the class, don't discuss your birthday party in school.

Retweeting the nice things that people say about you. Isn't it wonderful that they say nice things about you? Do you really have to tarnish it by retweeting it and, horror of horrors, blushing? When people retweet and write "blush," they are really saying: yes, I know, what I am doing is obnoxious, so I will pretend to be embarrassed. Even if your list of followers is so much bigger than theirs, you diminish the value of their fine work.

Publicly thanking important people for calling out your name/blog/site in their tweets/blog/video etc. Of course, it is lovely to thank them. A nice little email works. A direct message might be possible. For that matter, "@bigshot, thanks" does the trick. You can even use the bread-and-butter note that my mother taught me about so many years ago. But when you publicly thank a big name for a shout out, you are really saying, "Look who thought I was important enough to talk about me in his/her tweets/blog/video! Aren't I important?"

Here's what I don't hate: people who say, "Read my post. Come to my webinar. Check out my new tool." While too much of that isn't great, I really see it in a different class than the above three categories, i.e. not so awful. So why are they different? First, they don't say, "Read my post, the most excellent in the world." (Or at least, not that I've seen.) Second, they don't wrap their self-promotion in the guise of humility, in the guise of just politely thanking a bigshot for the shoutout, in the guise of just making plans. Those who tweet "Read my post" are honest about it.

Robbin

Related Posts

  1. I'm tired of the blog tag game - Can we change it?
  2. What I learned from my analytics
  3. Using Bit.ly for Spying, Link Building and Happiness
  4. Leveraging Social Media Links to Drive Traffic Part 2
  5. Just because you paid a lot of money for that website doesn't mean it's gonna convert

9 Responses to "I Hate When You Tweet about Yourself (and pretend not to)"

ha, i was thinking of writing a similar post myself, but you beat me to it!!

i would also add that it is extremely rude to tell the whole world about fabulous vacations, expensive new gadgets, and how much money you just spent... no one wants to hear you brag, and not everyone is in your same position - remember that!

if you wouldn't stand up and say something in front of a random crowd of people (some may be close friends, some may be acquaintances, some may be strangers) then don't say it on twitter.

otherwise, it's *unfollow* for you!

The worst, Jami, is when I can't unfollow, because they also say interesting things that I have to stay on top of. And, you know there is a website devoted to tweets like, "Darn, can't take Daddy's jet out today" and similar. It is in Jim Sterne's book, have to get the name.

Great post Robbin. I laughed out loud. If you behave like this in the real world you will lose your followers in no time, strange that people don't get that online.

Isn't it the truth?

On the Internet, nobody knows you're a dog. But they can all tell you're a son of a bitch!

Robbin - the website is http://tweetingtoohard.com/ Today it includes these gems:

We are young, attractive, talented, will be making six figures well before 30, and people buy us free drinks. Life is pretty okay.

I don't just make stuff. I make stuff HAPPEN.

And, to prove your memory and reality are in sync, my favorite of the day:

OMG i was saying how i couldn't afford the gas to fly daddy's jet to the riviera this summer, and this barista totally rolled her eyes at me

Totally.

Great post. But I like Twitter much more than FB. I hate people who spam my wall all day.

Thank you thank you thank you, a thousand times thank you! This is the first time someone has mentioned my primary pet peeve: talking about all the fabulous things and trips you're enjoying. I'm not sure when bragging became acceptable, but in my world, it's still exceedingly rude.

Add to that the people on Facebook that post a status and "Like" their own status.

Ohh, I hate all social networks. When I come online I see 1,000 requests for crappy Mafia Wars, farms, poker, etc. Skype is enough for me :D


Source: http://feedproxy.google.com/~r/lunametrics-blog/~3/lSFQgjBdc4Y/


Tuesday, 12 April 2011

Leveraging Current Events When Link Baiting

Building a legitimate, natural-looking backlink portfolio is one of the hardest parts of SEO. It is especially hard when the website you are link building for lacks linkable content. When your site lacks linkable content, you have to force the backlinks coming into your site, which nine times out of ten never looks natural. Now, I'm not saying that you can't rank by forcing links, because you definitely can, but when you supplement your forced links with natural links, your website will rank faster and have staying power for years to come.

One of my favorite ways to generate natural links coming into a website is by creating and publishing link bait. One form of link bait that I really enjoy and have a lot of success with is being one of the first to blog about current events happening in your industry.

With the roll out of Google Instant, the almighty search engine has gotten pretty close to indexing new content published in as little as a few minutes. Add some Diggs, Stumbles, and Tweets within the first 30-60 minutes and you've got instant visibility!

You're probably thinking: "But Greg, my industry is so boring, there is no way for me to find things to blog about, let alone current events." I beg to differ. Your industry is not boring; you're boring! And you're not thinking outside the box.

For example, I have a client who manufactures wood flooring used on college basketball courts. What could my client possibly blog about that would be interesting? Let me give you a hint: there was some sort of a basketball tournament that wrapped up in the last week. Imagine if you were one of the first to publish a blog post announcing the winner of the NCAA Final Four tournament. You would not only drive loads of traffic, but people would consider you a resource and link to you. It's really a beautiful thing.

Let's consider another example. I also have a client who manufactures and sells wedding dresses. Again, what could my client say about wedding dresses that hasn't already been said thousands of times before? What if my client started to blog about celebrity weddings, providing details and photos, and was one of the first to publish these posts? Same scenario as above: you will drive loads of traffic and people will link to you.

As I shared in my webinar a few weeks back, I leverage this exact strategy on a website I own that sells tungsten wedding rings. I blog about recent celebrity engagements and provide pictures of their engagement/wedding ring and I see huge success. I know it's kind of embarrassing to share, but I watch E! News & read US Weekly on a regular basis to get the scoop on engagements so I can post the content as soon as possible. It's truly all about being smarter than the next guy!

So, to get started, I would first identify the types of people, businesses, organizations, etc., who would use your product or service. Then, figure out how they use your product or service. Once you know this, you should be able to come up with some clever ways to produce content that would be interesting for people, especially if you can tie it back into a current event.

Good luck! Any questions/comments that add to the discussion are always encouraged and appreciated!



Source: http://www.seo.com/blog/leveraging-current-events-link-baiting/


Monday, 11 April 2011

Importance of W3 Validation for SEO

W3 validation plays an important role in SEO. The guidelines and rules set forth by the World Wide Web Consortium (W3C) are based on best online practices that every SEO practitioner should follow.

By using the set standards and incorporating best practices, SEO experts can help ensure that search engines can easily interpret websites and rank them accordingly. Further, once a website is properly validated it becomes relatively easier for search engines to make sense of the content-to-code ratio and reach the relevant information. As excess code is removed from webpages, the keyword-driven content gets higher priority. Another aspect of W3 validation that is useful from an SEO perspective is that W3 compliance helps ensure a website can easily be crawled and spidered.

Beyond this, W3 validation also helps make your website user-friendly and accessible, including to users with disabilities. Users can access your website through different devices, such as mobile phones and PDAs, and in different browsers, including Mozilla Firefox. Even if your website uses a range of functionality and technologies, W3 validation helps it behave consistently across different platforms.

How Important is Validation for SEO?

The foremost goal of SEO is to achieve the highest possible organic rankings, and for Google alone that means weighing up roughly 200 ranking indicators, along with other factors that can influence the results. That is why it is difficult to achieve 100% W3 validation on websites. Making a site 100% W3 compliant is ideal, but it is essentially the job of web developers rather than SEO experts. While it only takes a few minutes to run validation tests, fixing the issues at an SEO level can easily take hours, and even longer if you have to document how to deal with each problem. Therefore, 100% validation is often not practical, and perhaps not even required, for search engine optimization.

W3C Validation Standards for SEO

Apart from image alt attributes, standard HTML elements, when used properly, are considered SEO best practice. Elements such as heading tags, bold text and bullet points are all important standards to focus on from an SEO point of view.
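As a small, hedged example of those standards in valid markup (the product and wording are invented):

    <h1>Wooden Garden Furniture</h1>
    <img src="oak-bench.jpg" alt="Handmade oak garden bench">
    <ul>
      <li><strong>Solid oak</strong> construction</li>
      <li>Weatherproof finish</li>
    </ul>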

W3 validation is highly relevant to ensuring that search engines properly index webpages in their results. Without clean, valid markup, search engines may struggle to index your website correctly.


Source: http://www.seogodfather.com/blog/importance-of-w3-validation-for-seo/


Sunday, 10 April 2011

6 Ways the Google Analytics Dashboard is Better Than You Think

The Google Analytics dashboard doesn't give you a lot of options. You click the "Add to Dashboard" button in a report and you get what you get: depending on the report, it might be a pie chart, or a graph of the metric over time, or the top 5 rows of the report. It's OK for a quick snapshot, but it's not very customizable. In fact, there are lots of third-party tools to build a better dashboard.

But the GA dashboard just might be better than you think. A few of us were sitting around the table at LunaMetrics yesterday and the topic came up, and we made a few observations of how useful the dashboard can be.

1. Save a filter

You already know you can navigate to any of the reports and add them to your Dashboard, rearrange them, and get rid of the ones you don't want.

But did you realize that if you filter the report (using the containing/excluding filter at the bottom of a report, or the Advanced Filter), when you add that report to the Dashboard, that filter is saved? For example, if I filter the Keywords report to exclude "luna" (to show just our non-branded keywords) and then add the report to the Dashboard, I see just the non-branded keywords.

(Screenshots: the report filter and the filtered Keywords report)

This means, for example, that you can have multiple Top Content reports in your Dashboard, each filtered for different sets of pages. Or multiple Keywords reports, each filtered for different keywords.

The not-so-nice part is, the label of the report doesn't say anything about that, just "Keywords" for all of them. See #4 below for a trick to help in this department.

2. Drill down

If my business is selling widgets in a storefront in Schenectady, I probably don't need the world map from the Map Overlay report on my Dashboard.

But fortunately, I can drill down at whatever level I like (continent, country, state, city) and add the drilled-down report to the dashboard.

Map Overlay drilled down to Pennsylvania

This works for any report: drill down to a specific traffic source, a specific page, whatever you like.

3. It's a permalink to the report

The really nice thing about #1 and #2 is not just that you get the report on the Dashboard. It's that "View Report" link at the bottom of the widget, which takes you directly to the report. It's a good way of saving reports you go back to over and over. Instead of taking 5 clicks to navigate there and apply a filter or drill down, you get there in 1 click straight from the Dashboard.

4. Name reports better with Custom Reports (kudos to John Henson for pointing this one out)

Using a Custom Report allows you to change the name of the report so that instead of seeing "Top Content" for multiple reports, you could see "Articles by Tim", "Articles by Tom", etc.

By simply recreating the report you are interested in as a Custom Report, you can name it whatever you want. For example, you could create a Custom Report named "Articles by Tim", filter that report to only show Tim's articles, then add it to the Dashboard. Then create another Custom Report (which may well be exactly the same except for the name and filter) named "Articles by Tom", filter that report to only show Tom's articles, and add that to the Dashboard as well.

What you end up with is a much easier to read Dashboard report. The titles are much more descriptive and easier to identify.

5. Apply an Advanced Segment across lots of reports at once (kudos to Dorcas Alexander for this one)

Advanced Segments are great for looking at just the audience that you're interested in, and especially useful because you can use them across almost all the reports in Google Analytics to see just that audience. But if I want a summary of information, I might be stuck applying that Advanced Segment and compiling data from several different reports.

You can also apply Advanced Segments from the Dashboard to see a quick view of all the reports segmented at once. This can make getting the picture of what your segment is doing that much quicker and easier.

6. Export and email the Dashboard

Instead of maintaining emails for multiple reports (even with the "Add to Existing" feature, this can sometimes be a pain), you can also just choose to email the Dashboard report, and you get an attachment with data from all the reports you've added to the Dashboard. Now, if you want to add a report to your scheduled email, just add it to the Dashboard and you're done.

Related Posts

  1. New Features in Google Analytics!
  2. New Google Analytics Interface and New t-shirts
  3. Segment your Goal Funnel in Google Analytics
  4. The New Google Analytics: Ready for Enterprise
  5. Custom Report for AdWords

6 Responses to "6 Ways the Google Analytics Dashboard is Better Than You Think"

The one thing I don't like about the dashboard is when you set it up as a scheduled email, I can't add other reports to the same email.

I've set up several dashboards to be emailed. Then, when I want to add something else to the email report, I go into Set up an Email and Add to Existing and the dashboard email is not there.

Anyone have any insight on this?

I appreciate that the Google Analytics dashboard does a good job, but I still think third-party tools are much better. The link you provided to the Google Apps marketplace showcases the best ones; Bime at the top of the list is really cool and definitely worth a try for anyone wanting to get a more competitive insight into their web analytics.

It helps to know that I can make the dashboard work for me and adapt it to my personal preferences just by being a bit smart about it. Interesting.

First surprise result: the best products are from Microsoft!

Advanced segments are a great aspect of GA. Segmenting out non-branded traffic, particular areas of the site such as the blog, visitors who've returned to the site more than a certain number of times in a period, etc., all give you incredible insight into behavior that is actionable. Many thanks for sharing this.

Great ideas, I tend to get lost in GA often and I'm always wanting the content tab with top posts.
Thanks!


Source: http://feedproxy.google.com/~r/lunametrics-blog/~3/ZFNm92GrriQ/




Thursday, 7 April 2011

Paid links: do you still have to worry about them?

Once again, paid links are a hot topic in the search engine optimization community. The website of J. C. Penney had number 1 rankings for many competitive keywords. It turned out that the J. C. Penney website obtained these rankings through buying links on over 2000 pages.

The paid links were reported to Google and many of J. C. Penney's rankings dropped from number 1 to number 70 and below.

What are paid links?

If you pay the webmaster of another site to link to your website, then the link is a paid link. Paid links can be used to advertise your website on other sites. As long as the paid links use the rel=nofollow attribute, Google doesn't have any problems with them.
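In practice, a paid or sponsored link of the kind Google has no problem with would carry the nofollow attribute, roughly as sketched below; the advertiser URL is hypothetical:

    <a href="http://www.example-advertiser.com/" rel="nofollow">Visit our sponsor</a>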

The problem arises when paid links are used to get higher rankings in the regular search results on Google.

Google is very clear about paid links

Google does not like paid links. According to Google's official statement, you should avoid paid links at all costs:

"[Some] webmasters engage in the practice of buying and selling links that pass PageRank, disregarding the quality of the links, the sources, and the long-term impact it will have on their sites. Buying or selling links that pass PageRank is in violation of Google's Webmaster Guidelines and can negatively impact a site's ranking in search results."

Google even has an official form that enables you to report paid links to Google:

"If you know of a site that buys or sells links, please tell us by filling out the fields below. We'll investigate your submissions, and we'll use your data to improve our algorithmic detection of paid links."

Should you use paid links to promote your website?

The problem with paid links is that they work. As long as nobody notices that you're buying links, paid links can have a positive effect on the search engine rankings of your website. However, as soon as Google detects the paid links your website can get in major trouble.

There are several problems with paid links:

  • A competitor might report your paid links to Google and your website will be penalized.
  • A competitor might buy links that point to your website, report them to Google and your website will be penalized.
  • A competitor buys links to a throwaway domain, sees where they appear, drops the links and waits for you to buy them. Then the competitor reports you to Google for buying links.

While paid links can improve your rankings, they are also extremely risky. If you plan to build a lasting business, you should avoid paid links. The potential damage exceeds the benefits by far.

Your website must have backlinks to get high rankings on Google

Backlinks are very important to get high rankings on Google. That's why Google works so hard on filtering the wrong kind of links.

The links that point to your website should be from related websites and they should contain the keywords for which you want to get high rankings. Do not manipulate the links to your website by buying links and do not join automated link systems to increase the number of links to your website.

If you want lasting results, focus on ethical search engine optimization methods. There are many ways to get good links (related websites, blogs, social bookmark sites, directories, etc.). IBP helps you to get them all.


Source: http://feedproxy.google.com/~r/free-seo-news/cXWx/~3/FMSU1b8H2nQ/newsletter460.htm


Square One?s Shopper Marketing team talks changing retail landscape

Square One's shopper marketing team had the privilege of assisting Dayton, OH reporter Caroline Shannon-Karasik earlier this month with an article about the changing face of retail. The article takes an in-depth look at how technology is changing how people shop, as well as where and when they shop. We know how important it is to keep up with today's shopper as technology continues to shake up traditional shopping patterns. Traditional weekly grocery stock-up trips are a thing of the past as shoppers hone their multichannel shopping strategies to best suit their new daily routines. iPhone apps, social media, and digital shopping lists are quickly becoming part of the new "normal". As experts in both traditional and digital shopper marketing, Square One keeps its thumb on the pulse of shopper behavior.

Congratulations on the article, Caroline; we enjoyed working with you. According to Caroline Shannon-Karasik:

Navigating the "big business" of your local grocery store

A pretty girl in a curve-hugging dress or a shirtless man with sturdy biceps have eye appeal. So do photos of a sandy, sunny beach or a gorgeous sunset. But how about a basket of red, juicy apples? Or a just-made burger with cheddar cheese oozing down its sides, placed between a fresh, flaky roll?
That food and its eye appeal just went and got all sexy on you, right?
It might sound far off, but the truth is, you don't have to be a foodie to notice that grocery stores have the corner on helping their foods to flaunt what they've got. Like a proud lion who puffs his chest, food stores are keen on the idea that a "notice me!" presentation is the way to catch a consumer's eye.
"Stack 'em high and watch 'em fly," said Jill Moorhead, marketing director at the Hills Market in Columbus, referring to a popular line from the movie Reality Bites. "Keeping a fully stocked display, no matter how simple or complicated, is the best way to move product. No one wants to buy the last loaf of bread or choose from a dwindling display of applesauce. Too many questions enter a consumer's mind. The first being, 'How long has that been there?'"
It's notions like this that have created a need for stores to address what might appear to be the smallest of details, everything from the way in which a store is laid out to the products that are placed side by side to the first thing a consumer is greeted with when they walk in the door.
"Technology has forever changed the way people shop," said Marian Leonard, vice president of shopper marketing at Square One Agency in Dallas. "Eighty percent of shoppers come to the store informed about the products they are buying. It used to be that layout, specials and merchandising done in-store accounted for 70 percent of all purchase decisions; not today. [Now] most decisions are made before people even enter the store."
This, of course, leaves stores with a few things to be considered when working to stay successful. Two of the most important being: 1) "What sets my store apart from its competitors?" and 2) "What can I do to make the consumer's shopping experience an easy one?"
For the Dorothy Lane Market (DLM) Springboro store's Store Director Ed Flohre, a stress-free shopping trip is the name of the game. It's the strategy the DLM stores have successfully implemented since 1948.
"[Research shows the average customer feels] most grocery store trips fall in line with a visit to the dentist," Flohre said. "We really try to make it an experience that's not that painful."
Of course, the general consensus shows "painful" is hardly the way the average DLM shopper would describe his shopping experience. Between the smell of just-baked bread when first entering, to the luscious displays of produce and prepared foods, the only thing that hurts is the hunger pangs a DLM display tends to dig up.
"We want you to come in and feast with your eyes and, hopefully, your nose," Flohre said, pointing to the unique way in which DLM displays everything from produce to deli meats. "Nothing looks like a grocery department in the sense that, when you go into other grocery stores, everything looks like little soldiers, nice and neat. For us, that's just boring. We want it to have a kind of flow, and change heights, colors and views, and give people a sense of 'Wow, that looks really good.'"
Tactics that evoke those sorts of feelings, Leonard said, are what allow a consumer to continue wanting to purchase from that particular store.
"The retail environment is constantly changing and has been since grocery stores were invented," Leonard said. "The most important thing for retailers to do is keep in mind the needs of shoppers. Shoppers are talking to retailers with every purchase they make.
"From evaluating the number of times they shop, seeing what they purchase, how much they buy, and how they use technology to shop, all give retailers a chance to study, get to know, and provide value to their shoppers. If they can do this successfully, they will continue to win."
Bill Chidley, senior vice president of shopper sciences at Dayton-based Interbrand Design Forum, said this notion of working to understand a shopper versus focusing solely on competing with one another has become increasingly important.
"Grocery stores tend to be very similar experiences from one to another, and historically they have sought to compete on price and quality of fresh food alone," Chidley said. "These things are important, but becoming harder to demonstrate a true difference. We help grocery chains understand what matters to shoppers beyond these two issues, and how to act on this understanding with the design and layout of the store, and their culture."
Moorhead said all of these factors point back to the idea of simplicity with a task that many consumers have deemed a chore. Product groupings, for example, have proven widely successful for stores like DLM, who work to put together displays that encompass the makings of an entire meal. Pasta is paired next to marinara sauce, seafood with an appropriate wine or lemon herb dressing, and potatoes with a fresh Gruyere cheese for topping.
"Becoming a teaching source is always a good thing for the consumer, whether it's a display of three food items and a recipe of how they can be worked together, or a cooking class or website feature," Moorhead said. "Groceries have a captive audience when it comes to cooking skills. How they choose to educate the consumer is up to them."
Given the ever-changing and growing market, education is key for stores to thrive. Scott Sanders, vice president of New Jersey-based Bosco Products, Inc., said mass merchandisers and supercenters like Wal-Mart and Target, "alternate channels" like drug stores and limited assortment stores such as Aldi or Trader Joe's stores, are providing "alternative venues for shoppers to buy groceries, as evidenced by a falling number of traditional grocery trips per household per year over the past 30 years."
What that means for the average grocery store is a push to up the ante, Moorhead said.
"[The retail environment] is much more interactive now," said Moorhead. "We're no longer a place to buy cereal and peanut butter. Groceries become wine bars and a place to learn about cooking. Thirty years ago, customer service was inside the store and maybe in the parking lot. Now, customer service extends to Facebook, Twitter and beyond."
Leonard agreed, adding, "It is important to keep in mind that executing traditional in-store tactics is no longer enough to reach shoppers anymore. They should be layered into a more holistic approach that is designed to provide value to their lives at any point that they are in shopping mode.
Social media tools and digital display ads are ideal for reaching out to consumers when they are at home, Leonard said. While shoppers are out and about, mobile shopping apps that help search for coupons and product reviews or find retailer locations are helpful.
"Lastly, when they are in the actual store environment and they are making actual purchase decisions, there is an opportunity to reach them with the traditional elements such as display, coupons, bonus packs, recipes and so forth."
Rachael Betzler, Kroger public relations manager for the Cincinnati/Dayton marketing area, said these factors were something that Kroger stores were sure to consider when deciding the layout of their stores and what would go into them.
"Customers tell us what they want for a better shopping experience," Betzler said. "They want convenience. We have added a lot of the prepared foods such as stuffed pork chops or marinated meats. Something easy for the customer to take home and pop in the oven.
"[They've told us they] want to drink coffee while they shop, so we have added coffee shops. Having everything under one roof means less trips, which is even more important these days with the price of fuel."
For some stores, like Health Foods Unlimited (HFU) in Centerville, quite the opposite is true. Instead, the specialty store has chosen to keep it simple for shoppers by slimming down extras and has successfully done so for more than 30 years.
"We try and provide a clean, bright and organized store to make it easier for our customer to make a proper selection for their needs," said HFU Manager and Part-Owner Emilie Miller. "This may seem basic, but we have a lot of product and it can get overwhelming."
This is proof that what works for one store may not be the best policy for others, leading stores to discover what sets them apart from others and capitalize on those strong points.
"As things change we're going up against businesses that have 1,000 to 2,000 stores; there's no way we can match all of their prices," DLM's Flohre said, adding that, instead, the store has capitalized on providing items other stores don't carry, such as imported foods and gourmet products. "We decided we wanted to go with our strengths which was the love of food and great customer service."
While some stores might lean on gimmicks or misleading advertising to sell products, venues like DLM are breaking through without them.
"There are always ways to entice the consumer while being honest," Moorhead said. "A store can purchase a machine that shoots scents of orange into a produce department; we'd rather just open up an orange and divide it up into samples."
This way of thinking gives a whole new meaning to "buyer beware," so take caution: if you're not careful, you just might enjoy your next grocery store trip, "feasting" before you even hit the checkout line.



Source: http://www.sq1agency.com/blog/?p=2588


Monday 4 April 2011

Official: Google launches a new anti-spam algorithm against content farms


Welcome to the latest issue of the Search Engine Facts newsletter.

Earlier this month, Google announced that they would release several new anti-spam algorithms this year. The first algorithm update has just been released and it deals with content farms. How does this affect your website and what do you have to change on your web pages?

Also in the news: some content farm providers claim to have arrangements with Google, local and mobile advertising becomes more important, Google deletes ads in Google Maps and more.

Table of contents:

We hope that you enjoy this newsletter and that it helps you to get more out of your website. Please pass this newsletter on to your friends.

Best regards,
Andre Voget, Johannes Selbach, Axandra CEO

Earlier this month, Google announced that they will release several new anti-spam algorithms this year. The first algorithm update has just been released and it deals with content farms.

What are content farms?

There are two slightly different definitions of content farms:

  1. Content farms are scraper sites that aggregate content from other sources to rank for a variety of long-tail keywords. These sites have no unique content of their own; they exist to get clicks on the AdSense ads they display.

  2. Content farms are websites that produce low-quality content in bulk, often written by workers in low-wage countries. The main purpose of these sites is to rank for as many keywords as possible and to get clicks on the AdSense ads displayed on the site.

Under Google's previous algorithm, sites that copied content from other websites often ranked higher than the original source. That's why Google released this algorithm update.

Google's Matt Cutts confirmed the new algorithm:

"[I mentioned] that 'we're evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others' content and sites with low levels of original content.'

That change was approved at our weekly quality launch meeting last Thursday and launched earlier this week."

Less than 0.5% of search queries have significantly different results

Matt Cutts also said that most surfers won't notice the change:

"This was a pretty targeted launch: slightly over 2% of queries change in some way, but less than half a percent of search results change enough that someone might really notice.

The net effect is that searchers are more likely to see the sites that wrote the original content rather than a site that scraped or copied the original site's content."

The algorithm doesn't seem to be perfect yet

In an online discussion, many webmasters complained that their original sites have suffered while low quality sites still rank well:

"It doesn't matter if it is all 100% unique with tons of backlinks and really well laid out or simply an image. Everything got whacked.

Our out of date non unique ad filled sites are humming along FINE. So what's the message here Google? Write an in depth article that takes 3 days to complete and is linked to by hundreds of companies and gov agencies and loose all positions sitewide while our out of date halfbaked and useless content does fine?"

Should you change your web pages?

Google doesn't like spam. If you want to get lasting rankings on Google, you have to do three things:

Spammy SEO techniques often deliver quick results, but they only work for a short time.

If you plan to build a lasting online business, focus on ethical search engine optimization methods. It takes longer to get high rankings with ethical SEO methods, but your website will keep those rankings for much longer.

The content farm algorithm update was only the first step; Google will release many more anti-spam algorithms this year. If you want to be successful with your website, you must make sure that your site is useful and that it offers the content Google wants to rank.

Back to table of contents - Visit Axandra.com

Google's John Mueller: what to do if your website has been downranked

In an online discussion, Google's John Mueller says what you can do if your website has been downranked after Google's latest algorithm update:

"One thing that is very important to our users (and algorithms) is high-quality, unique and compelling content. Looking through that site, I have a hard time finding content that is only available on the site itself.

If you do have such high-quality, unique and compelling content, I'd recommend separating it from the auto-generated rest of the site, and making sure that the auto-generated part is blocked from crawling and indexing, so that search engines can focus on what makes your site unique and valuable to users world-wide. "
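
As a purely illustrative sketch (the /auto-generated/ path below is a made-up placeholder, not something Mueller named), an auto-generated section can be kept away from the spiders with a robots.txt rule, or kept out of the index with a Meta robots tag on each of those pages:

  # robots.txt - stop compliant spiders from crawling the auto-generated section
  User-agent: *
  Disallow: /auto-generated/

  <!-- or, on each auto-generated page, tell robots not to index it or follow its links -->
  <meta name="robots" content="noindex, nofollow">

Keep in mind that a page blocked in robots.txt is never fetched, so a Meta tag placed on it will not be seen; pick whichever approach suits the pages you want to hide.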



What are Demand Media's secret "agreements" with Google?

"For those of you that haven't heard of the company they are a content farm using outsourced writers to produce 5000 articles per day on sites such as eHow and various other informational resources. [...]

Google declared war on 'content farms' last week but refused to comment on Demand Media specifically. [...] I certainly won't be buying shares in Demand Media anytime soon, unless they can let us have a look at that agreement with Google perhaps."

Google: local and mobile advertising becomes more important

"In Local, over 5 million businesses have claimed their Google place pages. [...]

Click-to-Call ads are generating millions of calls every month. A lot of advertisers are running these campaigns. I think one you can see if you tried is DirectTV. We did launch a call-only option where the only clickable link in the ad is actually a phone number, which not surprisingly substantially increases the click through rates on mobile devices."



Google drops real estate search listings

"In part due to low usage, the proliferation of excellent property-search tools on real estate websites, and the infrastructure challenge posed by the impending retirement of the Google Base API (used by listing providers to submit listings), we've decided to discontinue the real estate feature within Google Maps on February 10, 2011."

Google's Schmidt wants to set the record straight

"Eric Schmidt opens up about his company's executive shuffle, whether or not Facebook poses a threat, and where Google is competing with Apple. [...]

This has nothing to do with competitors. I publicly said the next 10 years will be as successful as the past 10. We're going to run this way for a while. It's a full-time job just to deal with."

Search engine newslets

  • Rumor: Twitter self-serve ad platform coming next year.
  • Google to acquire fflick for $10 million.
  • Google adds snow conditions results and improved weather forecasts for mobile.
  • Has Larry Page doomed Google? Larry Page's Google 3.0.
  • Ed Vaizey hears MPs' concerns over Google monopoly.
  • Bing feature update: compare travel destinations.
  • Google executive missing after Egypt protests.

Back to table of contents - Visit Axandra.com

300,000 readers will read your success story!

Let us know how IBP has helped you to improve your website and we might publish your success story with a link to your website in this newsletter. The more detailed your story is, the better. Click here to tell us your story.

Here's an example:

"We now have several page 1 entries in Google, Bing and Yahoo!"

"We have been using IBP since launching our site. It has proved to be successful for us. We now have several page 1 entries in Google and Yahoo.

In fact we have the number one slot on Yahoo for one of our main key phrases. We found all of the software very useful, particularly the ranking tools, the optimizing section and the submission pages.

We started out looking at our competition and seeing the comparison to our site through the optimizer with the top 10 report. This was able to show how much we differed from the sites above us and offered recommendations to improve our position.

Once we put as much as we could into practice, the results started to come. Now we keep an eye on our positions in the search engines using the ranking checker. It enables you to see your improvements and any downward movements. We can recommend this software."
Steve Sewell, www.PerfectFloridaVillas.com

Back to table of contents - Visit Axandra.com


The Search Engine Facts newsletter is free. Please recommend it to someone you know.

You may publish one of the articles above on your Web site. However, you must not change the contents in any way. Also, you must keep all links and you must add the following two lines with a link to www.Axandra.com: "Copyright by Axandra.com. Web site promotion software."

All product names, copyrights and trademarks mentioned in this newsletter are owned by their respective trademark and copyright holders.

Back issues:
http://www.free-seo-news.com

Back to our newsletter archive | RSS feed for weekly search engine ranking facts

Source: http://feedproxy.google.com/~r/free-seo-news/cXWx/~3/ADhUag83SCg/newsletter458.htm
