Friday, July 31, 2009

Yahoo gets hard look from investors following deal


SAN FRANCISCO (Reuters) - Yahoo Inc shares fell for a second straight day after the company announced a Web search partnership with Microsoft Corp that failed to live up to Wall Street expectations.

Even as the broader market rallied, Yahoo shares dropped as much as 6 percent on Thursday. The stock recovered a little in late trading, but is still down 14 percent since the agreement was announced on Wednesday morning.

The deal, in which Yahoo essentially outsources search to Microsoft, is being viewed by analysts as more positive for the software company than for Yahoo. Microsoft's stock rose about 1 percent on Thursday.

"Everybody was anticipating something much more than this," said Bob Bacarella, portfolio manager at the Monetta Fund, which does not own Yahoo shares. "To me it's like Yahoo sold their soul, but they still don't have a body."

Some investors had hoped the deal would help pave the way for an acquisition of Yahoo by Microsoft, which offered as much as $33 a share to buy the company last year. Yahoo shares were down 3 percent at $14.65 in Thursday afternoon trading.

Others were disappointed that Microsoft did not agree to pay Yahoo an upfront payment, which some had expected to be several billion dollars.

With the prospect of a Microsoft acquisition no longer on the table, investors are taking a harder look at Yahoo's fundamentals, said Mark Coffelt, a portfolio manager at the Empiric Core Equity Fund.

"People are reassessing why they own Yahoo and where it's going to trade and how fast it's going to grow," said Coffelt.

Yahoo currently trades at around 34 times historic earnings, compared with Google Inc's 21 times multiple.

"It's priced to see the company grow really rapidly and there's certainly no evidence of that in the last few years," Coffelt said.

Cowen & Co analyst Jim Friedland said the PE ratio does not tell the entire story on account of Yahoo's significant holdings in Asian assets, including the Alibaba Group, which owns Alibaba.com Ltd.

Looking at Yahoo from an enterprise value to EBITDA (earnings before interest, tax, depreciation and amortization) ratio basis, he said the stock has valuation support. But that does not mean it has any upside in the near term.

"Just because something trades at a low multiple doesn't meant there's not a good reason for it," said Friedland.

Microsoft Chief Executive Steve Ballmer said on Thursday the deal was not understood in the market.

"It's a win-win deal from my perspective," he said, adding he was surprised by the steep fall in Yahoo's shares.

source: http://www.reuters.com/article/technologyNews/idUSTRE56T6H220090731

Thursday, July 30, 2009

Microsoft and Yahoo Finally Announce Deal

Well, they’ve finally gone and done it. Microsoft and Yahoo have partnered to “change the search landscape.” The two companies today announced a long-awaited deal, in which Microsoft will power Yahoo search while Yahoo will become the exclusive search advertising provider for Microsoft’s search engine, Bing.
You know this history by now. Here’s what the companies have to say:
Carol Bartz
“This agreement comes with boatloads of value for Yahoo!, our users, and the industry, and I believe it establishes the foundation for a new era of Internet innovation and development,” said Yahoo! CEO Carol Bartz. “Users will continue to experience search as a vital part of their Yahoo! experiences and will enjoy increased innovation thanks to the scale and resources this deal provides. Advertisers will also benefit from scale and enjoy greater ease of use and efficiencies working with a single platform and sales team for premium advertisers. Finally, this deal will help us increase our investments in priority areas in winning audience properties, display advertising capabilities and mobile experiences.”
Steve Ballmer
“Through this agreement with Yahoo!, we will create more innovation in search, better value for advertisers and real consumer choice in a market currently dominated by a single company,” said Microsoft CEO Steve Ballmer. “Success in search requires both innovation and scale. With our new Bing search platform, we’ve created breakthrough innovation and features. This agreement with Yahoo! will provide the scale we need to deliver even more rapid advances in relevancy and usefulness. Microsoft and Yahoo! know there’s so much more that search could be. This agreement gives us the scale and resources to create the future of search.”

Key Terms as highlighted in the announcement:

- The term of the agreement is 10 years;

- Microsoft will acquire an exclusive 10 year license to Yahoo!’s core search technologies, and Microsoft will have the ability to integrate Yahoo! search technologies into its existing Web search platforms;

- Microsoft’s Bing will be the exclusive algorithmic search and paid search platform for Yahoo! sites. Yahoo! will continue to use its technology and data in other areas of its business such as enhancing display advertising technology;

- Yahoo! will become the exclusive worldwide relationship sales force for both companies’ premium search advertisers. Self-serve advertising for both companies will be fulfilled by Microsoft’s AdCenter platform, and prices for all search ads will continue to be set by AdCenter’s automated auction process;

- Each company will maintain its own separate display advertising business and sales force;

- Yahoo! will innovate and “own” the user experience on Yahoo! properties, including the user experience for search, even though it will be powered by Microsoft technology;

- Microsoft will compensate Yahoo! through a revenue sharing agreement on traffic generated on Yahoo!’s network of both owned and operated (O&O) and affiliate sites;

- Microsoft will pay traffic acquisition costs (TAC) to Yahoo! at an initial rate of 88 percent of search revenue generated on Yahoo!’s O&O sites during the first five years of the agreement;

- Yahoo! will continue to syndicate its existing search affiliate partnerships;

- Microsoft will guarantee Yahoo!’s O&O revenue per search (RPS) in each country for the first 18 months following initial implementation in that country;

- At full implementation (expected to occur within 24 months following regulatory approval), Yahoo! estimates, based on current levels of revenue and current operating expenses, that this agreement will provide a benefit to annual GAAP operating income of approximately $500 million and capital expenditure savings of approximately $200 million. Yahoo! also estimates that this agreement will provide a benefit to annual operating cash flow of approximately $275 million; and

- The agreement protects consumer privacy by limiting the data shared between the companies to the minimum necessary to operate and improve the combined search platform, and restricts the use of search data shared between the companies. The agreement maintains the industry-leading privacy practices that each company follows today.

About the author:
Chris Crum has been a part of the WebProNews team and the iEntry Network of B2B Publications since 2003.

Wednesday, July 29, 2009

Microsoft releases Windows 7 code to PC makers


SEATTLE (Reuters) – Microsoft Corp said on Wednesday it is releasing the code for Windows 7 to PC manufacturers, keeping the software company on track to have machines running its new operating system in the stores by late October.

The move means Hewlett-Packard Co, Dell Inc, Acer Inc and other computer makers can start to load up new PCs, laptops and netbooks with the operating system, the successor to the unpopular Vista.

Both Microsoft and the manufacturers are hoping the full launch of Windows 7, scheduled for Oct. 22, will help lift PC sales out of the slump caused by the global economic downturn, and give the holiday shopping season an extra lift.

Manufacturers have been testing early versions of Windows 7 for several months, but this week marks the release of the “gold code,” according to a Lenovo Group Ltd executive, referring to the software industry jargon for the finished product.

PC makers no longer have to fly discs in helicopters to their manufacturing plants, as the transfer is now done electronically. But it still marks a dramatic day as manufacturers hustle to get new products into stores in time for the release date.

Machines that have Windows 7 installed, or devices that are compatible with it, will simply have the Windows 7 logo on them, a Microsoft executive said. The company will not be splashing the word “capable” around in marketing efforts, after it received complaints at its last launch that some machines branded “Windows Vista Capable” could only run the lower-end versions of the software.

Few industry watchers expect such problems to hit Microsoft this time around as the company has spent more time making sure PCs will be able to run the new software.

Monday, July 20, 2009

Strange Ad Copy for Google Adwords

Today I came across a truly strange AdWords ad in my Gmail interface. I have no idea why any advertiser would put up an ad like that, or, even if they did, how the AdWords editorial team could approve such ad copy. At the top of my Gmail inbox, the AdWords ad displayed as:

Text ad - www.url.com - Desc1 Desc2

Just to satisfy my curiosity, I clicked on the ad and found that it leads to a page that apparently doesn't exist. I was even more surprised, however, to see that the landing page URL is fully tagged to keep tabs on the performance of the campaign, which indicates that whoever set up the campaign knows a thing or two about AdWords and PPC.

Landing Page:
http://www.test.0000000184.com/path?keywords=keyword&creative=3849139695&adGroup=7711

While the ad copy mostly complies with Google's editorial policy, except for the fact that the landing page URL isn't working (which should get the ad disabled), it is rather strange for any company to run an ad like that.

There are just two possibilities that come to mind on this:

a) The ad is run by Google itself to test an internal feature or gather specific user behavior data (bypassing the editorial policy is not an issue for them).

b) A major agency is running these ads to gather user behavior data or to calculate competitive metrics like minimum CPC, CTR, or minimum impressions on specific keywords. In that case, however, I am not sure how they were able to bypass the AdWords editorial policy.

If you have any other ideas about the possible reasons for such an ad copy, please do mention them in the comments.

Thursday, July 16, 2009

Free Web Directories Lists

I have compiled a short list of directories of directories/lists of SEO-friendly directories that accept your websites/directories for free.

1- Add URL.nu : Add your URL to free SEO friendly directories Listing 850+ Free Directories

2- A List Of 200 FREE Directories To Submit Your Blog To – Daily Blogging Tips and Web 2.0 Development 200+ Free Directories

3- Free Directory List – Directory Critic 1150+ Free Directories (growing each day)

4- Info Vilesilencer: The Original SEO Friendly Free Directory List Updated Frequently

5- Free Web Directory List List of Strong and Best Free Directories

6- Free Web Directories – Free Directory Submissions 260+ Free Directories

7- ManhattanService – Website promotion and search engine optimization » List of Free Directories (Over 250) 250+ Free Directories

8- SEO Reloaded » Search Engine Friendly Directories Another Great List sorted out with Free/Paid Directories

9- Free Submission List of Universal Search Engines and Directories

10- phpLD Network- General Free Directories

11- Free Directories with high PR. Search Engine Optimization. List of free directories with high PR. BestCatalog.net – Web Site Development Resources Directory. Free Directories with high PR

12- Directories – seo advantage – Free directories

13- 427 Free Directories Sorted By PageRank | Stephan Miller

14- Free Web Directories List from TSW Professional Seo Company Experts

15- Value Directories List | Free Directories List | Paid Directories List | Niche Directories List

16- Free Directories List – Directory Rate

17- Web Directories- General Directories > Free 1300+ Free Directories

18- Free Web Directories – List Of Websites That Offer Free One Way Link 1750+ Free Directories

19- Free Directory List – Free Web Directories

20- Free Directory Listings – webseodesign.com

21- http://www.zinginfotech.com/storders…irectories.php 990+ Free Directories

If you know of any more lists or directories of directories (free directories only), please feel free to share them in this thread; I will keep updating it.


Tuesday, July 14, 2009

botw promo code


Dear Valued Member,

Last year we introduced BOTW Local, a user-driven directory of businesses throughout the United States. The adoption rate has proven to be nothing short of astounding, and daily traffic and users continue to skyrocket. Business owners have flocked to the Premium offering, driving targeted users to their businesses and improving visibility in the major search engines. Meanwhile the BOTW Web directory continues to dominate site owners’ internet marketing dollars, providing unparalleled ROI.

Now for the first time ever, we are pleased to offer business owners an exciting opportunity to get both great products at one incredible price. The BOTW Bundled Submission combines a Local Premium Listing with a Web Directory Listing and saves you money! As a special offer to our loyal members, you can take advantage of this great product and SAVE an additional 50%!!

That’s right – for the remainder of July you can use the special promo code and take advantage of a web directory listing and a dedicated page about your business on BOTW Local for half price! 50% off the already discounted price makes this an offer you can’t afford to pass up. Don’t miss out – prices will never be this low again. Two great products combined into one awesome, cost-saving bundle – and a 50% discount to blow out your ROI.

By combining these two products, you will not only save money, but you will also generate highly relevant leads to your business and increase your visibility in the major search engines – Bundle & Save Today!

When prompted, enter Promo Code: BUNDLE

If you already have a web directory listing and would like to take advantage of this special deal to have your business included in BOTW Local, please email us at bundled@botw.org for further instructions.

Thanks again for your support!

Best of the Web
http://botw.org/



Monday, July 13, 2009

read PDF without adobe reader

There is no doubt that e-mail is a fast-growing, high-speed way to communicate with each other. In these tech-savvy times almost all of us have an e-mail ID, and Gmail is one of the best-known e-mail service providers.

If you have a Gmail ID, a friend, business partner, or family member may send you a PDF file as an attachment. Normally you would need Adobe Reader installed on your computer to read that file, but if you don't have it installed and don't want to install it, no problem :) you can now read a PDF file directly without installing Adobe Reader. To read a PDF file in Gmail without Adobe Reader, just click on the "View" button.

You will see a “view” link in any email that contains a PDF attachment. The email has to actually contain the PDF and not just have a link to a PDF.

Thursday, July 9, 2009

Google takes aim at Microsoft with new PC platform

By Alexei Oreskovic and Edwin Chan

SUN VALLEY, Idaho (Reuters) – Google Inc is planning a direct attack on Microsoft Corp’s core business by taking on the software giant’s globally dominant Windows operating system for personal computers.

Google, which already offers a suite of e-mail, Web and other software products that compete with Microsoft, said on Tuesday it would launch a new operating system that will initially be targeted at netbooks.

Called the Google Chrome Operating System, the new software will be in netbooks for consumers in the second half of 2010, Google said in a blog post, adding that it was working with multiple manufacturers.

Netbooks are low-cost notebook PCs optimised for Internet surfing and other Web-based applications.

“It’s been part of their culture to go after and remove Microsoft as a major holder of technology, and this is part of their strategy to do it,” said Rob Enderle, principal analyst at Enderle Group. “This could be very disruptive. If they can execute, Microsoft is vulnerable to an attack like this, and they know it,” he said.

Google and Microsoft have often locked horns over the years in a variety of markets, from Internet search to mobile software. It remains to be seen if Google can take market share away from Microsoft on its home turf, with Windows currently installed in more than 90 percent of the world’s PCs.

The news comes as executives from the world’s biggest technology and media companies, including Google and Microsoft, gather in Sun Valley, Idaho for an annual conference organized by boutique investment bank Allen & Co.

A spokesman for Microsoft had no immediate comment.

Key to success will be whether Google can lock in partnerships with PC makers, such as Hewlett-Packard Co and Dell Inc, which currently offer Windows on most of their product lines.

HP, the world’s largest PC brand, declined to confirm if it would sell PCs running on the new operating system.

“We are looking into it,” said HP spokeswoman Marlene Somsak, referring to the operating system. “We want to understand all the different operating systems available to customers, and will assess the impact of Chrome on the computer and communications industry.”

Google’s Chrome Internet browser, launched in late 2008, remains a distant fourth in the Web browser market, with a 1.2 percent share in February, according to market research firm Net Applications. Microsoft’s Internet Explorer continues to dominate with nearly 70 percent.

FAST AND LIGHTWEIGHT

The new Chrome OS is expected to work well with many of the company’s popular software applications, such as Gmail, Google Calendar and Google Maps.

It will be fast and lightweight, enabling users to access the Web in a few seconds, Google said. The new OS is based on open-source Linux code, which allows third-party developers to design compatible applications.

“The operating systems that browsers run on were designed in an era where there was no web,” Sundar Pichai, vice president of product management at Google, said in the blog post. The Chrome OS is “our attempt to re-think what operating systems should be”.

Google said Chrome OS was a new project, separate from its Android mobile operating software found in some smartphones. Acer Inc, the world’s No.3 PC brand, has already agreed to sell netbooks that run on Android to be released this quarter.

The new OS is designed to work with ARM and x86 chips, the main chip architectures in use in the market. Microsoft has previously said it would not support PCs running on ARM chips, allowing Google an opportunity to infiltrate that segment.

Charlene Li, partner at consulting company Altimeter Group, said Google’s new OS could initially appeal to consumers looking for a netbook-like device for Web surfing, rather than people who use desktop PCs for gaming or high-powered applications.

But eventually, the Google OS has the potential to scale up to larger, more powerful PCs, especially if it proves to run faster than Windows, she said.

Google did not say how much it would charge for the operating system (OS), but Enderle expects Google to charge at most a nominal fee or make it free, saying the company’s business model has been to earn revenue from connecting applications or advertising.

Microsoft declines to say how much it charges PC brands for Windows, but most analysts estimate about $20 for the older XP system and at least $150 for the current Vista system.

Li added: “A benefit to the consumer is that the cost saving is passed on, not having to pay for an OS.”

“It’s clearly positioned as a shot across the bow of Microsoft,” she said.

(Additional reporting by Kelvin Soh in Taipei)

Wednesday, July 8, 2009

Matt Cutts Talks Geo Tags and Webmaster Tools


Google’s Matt Cutts frequently posts useful tips for webmasters on the Google Webmaster Central YouTube channel. The short clips generally offer valuable nuggets of info that can have an impact on your site’s performance in Google.

In these videos, Matt always answers questions submitted by users, and in a recent one he answers the question: “How do meta geo tags influence search results?”

How To Find A Good Web Host?

Usually when someone contacts me online, one of the most frequent questions I get asked is this: “How do I find a good web host?” or “Which web host do you use?”
This is one of those essential things you will need in order to earn that online income. You will need a web site and you will need a web host – somewhere to place that web site.

Sure there are countless marketers who work without a site but most have at least one main site which acts as their online calling card; where people can find them on the web. Of course, you can always use the social networks like Facebook, MySpace, Twitter… but that’s like operating your office out of Starbucks!

Instead, a well designed site is just more professional and business like. It also means you’re serious about your online business.

Picking a good web host is not easy. I have had numerous different web hosts over the years. Most of them have been very good and served their purpose, but I have had some bad experiences too – what long term webmaster or marketer hasn’t? Something always goes wrong eventually, especially if you have a lot of sites and many different web hosts.

Sometimes having a web host can be downright scary. Several years ago, the web host that I had my main site on was first sold and then went bankrupt. One day your site is running fine and the next day the web host is completely gone. Shut down. Your site and all your files completely gone. Vanished. Your host can’t be reached. Total blackout!

Fortunately, I had my site backed up on several systems and since that experience I always back my sites up on several computers and I go even further by copying them onto DVDs should those computers crash. If you have a site, I always suggest you back up your site files and make it redundant. The very worst can happen to you.

Overall quality and good 24/7 support should be your first objectives in choosing a good web host. You want a site that’s fast, easy to use, rarely down and has good support that you can reach at all times, should something go wrong.

You need to check exactly what features your web host is offering and for what price:

- How much storage space?
- Daily traffic or transfer limit?
- Email system? How many accounts…
- What type of server: Linux or Windows?
- How many other sites will be sharing your IP address?
- Do you need SSL or a secure site?
- Are sites/files backed up? How often?
- What are the support hours?
- And, of course, the price?

But don't always go for the cheapest hosting; keep in mind, you get what you pay for. A cheap web host won't save you money if the service is poor and your site is always going down. So don't automatically pick the cheapest web host.

Match your hosting service to the type of site you will be running; a simple HTML site makes simple demands on your hosting service, so a shared hosting plan may be quite adequate to meet your needs. For SEO reasons, you should also check the location of your server; I have found that the country your site is hosted in plays a role in your rankings, especially in Google.

However, if you have a site that's extremely interactive with forums and discussion groups, gets large bursts of traffic, or runs a lot of server-side scripts and programs – then you may need a more robust hosting service to meet your needs.

In this case, you might need a dedicated server to handle just your site. Many web hosts offer this service and it’s worth looking into if you have a site with extreme amounts of traffic or if you’re running forums, affiliate programs, email services… from your site. Most of my own sites are very simple and I have them on many different web hosts. Mainly because of my own experiences, I just don’t want to have all my sites on one host… the old “don’t put all your eggs in one basket” reasoning.

At the moment I am quite pleased with all my web hosts… many of my sites are on GoDaddy and I find them OK for simple sites like mine. I also find it convenient because they are also a domain registrar, so I can easily use them to buy my domains. That said, many experts suggest you should keep your domain registration separate from your hosting service: if your host suddenly vanishes, it is then only a simple matter of pointing your domain at a different host. If your host controls your domain, this can be a major problem. Always keep control of your domain in your own hands, but you probably already knew that.

Another web host I use is Bluehost which is very good, can’t remember the last time my site was down. They are very popular with around a million sites and my only concern is that they may become too popular and their services will be spread too thin. However, I have had very few problems with them and you can always reach their support.

I also have a site with Ken Evoy’s SBI (Site Build It), but I created that one mainly to get access to the enormous resources connected with SBI. It is slightly more expensive than some of the ones listed above, but SBI is an overall online marketing system that in my opinion can’t be equaled on the web. Well, perhaps the Warriors group could give them a run for their money, but it’s the community of like-minded webmasters with SBI which makes it special. They are always ready and willing to help you out, doesn’t matter if you’re an experienced pro or a complete newcomer. Several years ago, I took a very close first-hand look at the hosting service provided by SBI. You can find my opinions/review on SBI located in the resource box below.

There are countless web hosts to choose from. But do your homework; check around the different forums and see how everyone is rating the web hosts they are using. First-hand experiences are the best judge of whether or not a web host is good and reliable.

Moving a site from one web host to another can be a real pain, especially if you have a large-scale site, but if you're not totally satisfied with a web host and are having serious problems – simply change your hosting. Just make sure you're not going from bad to worse.

Still, finding a good quality web host will be a major chore, no matter how you look at it.

Perhaps, in the final analysis, nothing beats checking with your friends and fellow webmasters you trust. Ask them which hosts they’re using and if they’re satisfied with it. Nothing beats first-hand experiences when it comes to choosing a web host. Just make sure you’re comparing oranges to oranges, that is: make sure you have similar site requirements as your webmaster friends. If you both have similar type sites, then finding a good quality web host can be as easy as having a friendly chat over a cup of coffee.

About the Author: Titus Hoskins is a full-time professional online marketer who has numerous niche websites. Here’s a review of Ken Evoy’s popular marketing/hosting system: Site Build It Review For the latest web marketing tools try: Internet Marketing Tools.

Geo Location and Search Engine Rankings

Source: googlewebmastercentral: As part of Google’s goal to make the web faster, we uploaded several video tips about optimizing the speed of your website. Check out the tutorials page to view the tutorials and associated videos.
Matt Cutts answered a new question each day from the Grab Bag.
And during Adam Lasnik’s visit to India, he was interviewed by Webmaster Help Forum guide Jayan Tharayil about issues related to webmasters in India. We have the full three-part interview right here.
Matt Cutts answers: “Could you confirm whether the geographic location of the web host is a significant ranking factor for organic SEO?”

Tuesday, July 7, 2009

SEO Trends 2009


If you know anything about the fundamentals of search engine optimization (SEO tips), you will know that the experts’ opinions on what to do, what is now considered redundant, and which elements you absolutely must put in place on your website change with mind-spinning frequency.

However, the bigger trends in search engine optimisation are easier to keep up with and most seem to be here to stay, as business owners become more attuned to just how important SEO is to their success.

SEO Trends 2009 And Beyond

The industry and ‘bigger picture’ trends have become relatively easy to identify as the SEO world becomes more established - increased awareness of SEO in general being the first major trend.

Search engine optimisation used to be an exclusive world, with only those practising it really knowing what it entailed and how it boosted a website’s visitor stats.

Now, most people know what it means, and most business owners are aware of just how important the internet is and how SEO can help them gain more customers.

The next major trend, and one which is definitely here to stay, is the rise and rise of Google - they have dominated the world of search engine optimisation for years, and their lead over competitors such as Yahoo and MSN is only getting bigger.

When Google speaks, SEO consultants listen, and if your business is not being found on Google, it’s almost certain it’s not being found at all - around 70% of user searches are done using Google.

SEO tools and automated processes are becoming an important part of reporting on website results - developers are focusing on building these tools for companies to buy, particularly companies that may be struggling to find or afford a good search engine optimisation agency. Such tools can offer good basic data, but it’s worth remembering that it’s still what you do with the data that’s important.

Dipping into the actual technicalities of good SEO practice, good content has been king for a while and remains so.

As a business owner, if you write, or employ an SEO copywriter to write, good quality articles and press releases, this will benefit your search results hugely and encourage links to and from your website - and links are extremely important as the ‘currency’ of good quality SEO.

Finally, a trend clear to all - not just those operating in the search engine optimisation world - is the rise and rise of social networking sites.

Facebook and now Twitter are educating people to communicate in a whole different way, and business owners should be engaging these people in order to develop a whole new audience.

Keep Up With SEO - Your Business Will Benefit

There are many more trends in search engine optimisation, too many to mention here, but one thing remains clear - business owners small and large cannot afford to ignore SEO and must become well-versed in good SEO practice if they want to keep up with the ever-changing world of consumer web behaviour.

Author: Harvey McEwan



Sunday, July 5, 2009

5 Most Common Link Building Mistakes

Now that link popularity is known to be the most important factor that search engines consider when determining your PageRank, almost everyone has taken to posting their backlinks all over the internet. However, gaining credible link popularity that increases your PageRank is really not as easy as it sounds, and most people who do not know much about how PageRank works keep creating text links that are either useless or even harmful to their webpages. To make link popularity work for you, it is important to avoid the most common link building mistakes.

  1. Creating Links On Unrelated Pages.
    Search engines nowadays are smart enough to decipher whether your link is relevant to the users of the webpage it is placed on. So if you are placing text links to your cooking-related webpages on websites that deal with high-end electronic goods, wrestling, or real estate, you can be sure that these links will be trashed by the search engine, doing zilch for your PageRank. So create links on sites that have related content. For example, if you have a cooking-related website, placing links on home improvement, restaurant, recipe, and kitchen equipment sites will be helpful.
  2. Creating Text Links That Proclaim, "Click Here".
    Google's search engine spiders use the text of your hyperlink to determine what your website is all about, and this plays an important role in determining both your PageRank and your search engine rankings. By putting up hyperlinks that simply say "click here for more information" or "know more here", you are telling the search engines that your website is about clicking and knowing more, and those are the phrases you will rank for instead of getting high search engine rankings for your own keywords. Make it a point to use relevant keywords in your text links.
  3. Creating Links On High Pagerank Pages That Already Have Hundreds Of Links.
    Yes, it is true that the higher the PR of a page linking to you, the higher the value assigned to that link by the search engines and the greater your PageRank will be. This is because the value or points that you get from a page that links back to you is determined by the formula — page rank of the linking page / number of links on that page. But it is equally important to pay attention to the denominator of the formula, since a highly ranked page that already carries hundreds of links may end up giving you less value than a page with a lower rank but fewer links (see the worked example after this list).
  4. Placing Links On Dynamic Pages.
    Web pages that are generated dynamically take ages to get indexed, and in all probability, a text link on such a page will never help in increasing your PageRank. Identify dynamic pages by URLs with special symbols in them (&, ?, etc.) and save your time by not posting any text links to your site on such pages.
  5. Creating Links On Pages That SE Spiders Cannot Crawl.
    The whole point of building link popularity for increasing page rank is to make your site accessible to search engine spiders. However, search engine bots have not become smart enough to read pages generated by Flash or JavaScript. They are also unable to read text links placed within frames, so it is a complete waste of time creating back links on sites that are either Flash generated or framed, since these links would be totally ignored by search engine spiders.
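To put rough numbers on the formula from point 3 (using the simplified model described there and purely hypothetical figures): a link from a PR 6 page that carries 300 outbound links passes roughly 6 / 300 = 0.02 in value, while a link from a PR 4 page with only 20 outbound links passes roughly 4 / 20 = 0.2 – ten times as much value from the "weaker" page.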

Saturday, July 4, 2009

Semantics for SEO



Semantics is the study of meaning and relationships, and it plays a role in SEO. Google makes its money from being successful at crawling, indexing, and ranking data. One important aspect of this process is being able to understand content in a manner that is more sophisticated than keyword density. One way of doing this is Latent Semantic Analysis, or LSA. Google may not use this exact approach, but it is sure to use some similar system of semantic analysis to look at textual content.

Intro to SEO Semantics

By using semantics, search engines can have a basic understanding of the English language. There is an understanding of synonyms, antonyms, and polysemes. In addition, Google can relate niches and keywords. It is able to map complex relationships between keywords through its huge database of information and linking relationships.

For example, Google can understand the following relationships for a Make Money Online site.

Make Money Online

  • Make Money
  • Make Money Online
  • Make Money Blogging
  • Make Money on eBay

Internet Marketing

  • Internet Marketing Services
  • Online Marketing

Social Media Marketing

  • Social Media
  • Social Bookmarking
  • Web 2.0
  • Digg
  • StumbleUpon

Blogging

  • Blogger
  • Wordpress
  • Plugins
  • RSS Feed

SEO

  • Search Engine Optimization
  • Link Building
  • Article Marketing
  • Directory Submissions
  • Dofollow
  • Nofollow

Google knows that the usage of any of the above keywords is related to the primary keyword of make money online. It can relate long tail keywords as well as related keywords and the long tails of those related keywords.

Role of Semantics when Writing for Search Engines

Knowing that Google uses semantics is important when writing your site’s content and building links. You should use this knowledge to solidify your page’s focus on a keyword.

Quick Note on Keyword Density

Google's content analysis is more advanced than keyword density. Do not stress about keyword density. Keyword density will not help you rank higher. Anyone still teaching you to achieve a certain density is wrong. Keyword usage is important, but there is no set density that you should target. Simply use keywords naturally; ideas like LSA move Google away from basic keyword density analysis. Keyword density is not a direct measure of relevancy. Using a keyword 10 times does not make a page more relevant than a page that uses its keyword 3 times. If anything, the overuse of a keyword can hurt your rankings. Semantics can be used to determine whether a site is using natural language. An over-optimized, keyword-stuffed page does not use natural language and may hurt your rankings.

Use Long Tails

Instead of trying to use your primary keyword over and over, use long tails. Research your keyword before you write your article to determine long tail keywords. Select two or three long tails that have your main keyword as their parent keyword. Use them in your content to support your main keyword, and do it without keyword spamming. Not only does this create natural content, it also increases the number of terms you can rank for. You can now rank for your primary keyword as well as the long tails.

Related Terms

In addition to long tails, use related keywords. If you're writing about SEO, talk about internet marketing. If you talk about McDonald's, talk about hamburgers. If you talk about hamburgers, discuss hot dogs. If you talk about Apple, discuss the Mac and the iPod. These related terms will support your main keyword because Google understands the relationship. In addition, they will increase the number of keywords you can rank for.

Role of Semantics when Link Building

The classic advice applies here: vary your anchor text. If you get too many links with the same anchor text, Google will consider it a Google bomb. Varying anchor text is important for two reasons. First, it looks natural. Having the same anchor text everywhere is a sign of self-generated links. Second, semantics comes into play. You can use related anchor text to support your primary target keyword.

A post about BANS (Build a Niche Store) could use any of the following keywords as anchor text and still support the primary keyword.

  • BANS
  • BANS site
  • BANS ebay
  • Build A Niche Store
  • Make Money on eBay
  • Make Money Online
  • eBay Niche Store
  • Build an Online Store
  • eBay Affiliate

Google understands that all of these keywords are related.

Dominating Multiple Long Tail Keywords

By using this approach instead of stuffing one keyword phrase, you're able to dominate multiple keyword listings for long tails. You can rank for many long tails by simply mentioning the keyword. Instead of repeating one keyword phrase over and over, use semantically related keywords. This will reinforce your main keyword while also increasing the number of long tails you can rank for. As your page gains authority for the main keyword, your authority for the semantically related keywords will also increase. The opposite is also true: as you gain authority for the semantically related keywords, your authority for the main keyword increases.

Conclusion

Understanding that Google uses semantics to evaluate content can help you improve your on-site SEO. Simple measures like keyword density are outdated and should not be used to determine how targeted a page is. So when developing content, consider developing a list of long tails and related keywords that you can sprinkle through your content to help support your targeted keyword.




Friday, July 3, 2009

How to use a good robots.txt file for search engines

Using robots.txt, you can ban specific robots, ban all robots, or block robot access to specific pages or areas of your site. If you are not sure what to type, look at the bottom of this page for examples.

An example of an SEO-optimized robots.txt file (it should work on most WordPress blogs… just edit the sitemap URL):

User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/

User-Agent: Mediapartners-Google
Allow: /

User-Agent: Adsbot-Google
Allow: /

User-Agent: Googlebot-Image
Allow: /

User-Agent: Googlebot-Mobile
Allow: /

Sitemap: http://www.example.com/sitemap.xml


When robots (like the Googlebot) crawl your site, they begin by requesting your robots.txt file from the root of your domain (for example, http://example.com/robots.txt) to learn which parts of the site they may crawl.

Robots.txt Samples

Following are a few simple examples of what you might type in your robots.txt file. For more examples, read the robots.txt specification. (In the specification, look for the “What to put into the robots.txt file” heading.) Please note the following points:

Important: Search engines look for robots.txt files only at the top level of a domain. So this plugin will only help you if typing in http://blog.example.com/ or http://example.com brings up WordPress. If you have to type http://example.com/blog/ to bring up WordPress (i.e. it is in a subdirectory, not in a subdomain or at the domain root), this plugin will not do you any good. Search engines do not look for robots.txt files in subdirectories, only in root domains and subdomains.

Following are a few examples of what you can type in a robots.txt file.

Ban all robots

User-agent: *
Disallow: /

Allow all robots

To allow any robot to access your entire site, you can simply leave the robots.txt file blank, or you could use this:

User-agent: *
Disallow:

Ban specific robots

To ban specific robots, use the robot’s name. Look at the list of robot names to find the correct name. For example, Google is Googlebot and Microsoft search is MSNBot. To ban only Google:

User-agent: Googlebot
Disallow: /

Allow specific robots

As in the previous example, use the robot’s correct name. To allow only Google, use all four lines:

User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /

Ban robots from part of your site

To ban all robots from the page “Archives” and its subpages, located at http://yourblog.example.com/archives/, use:

User-agent: *
Disallow: /archives/



Web Optimization – Four Common Problems That Stop Your Success

Anyone interested in online marketing knows that web optimization is critical to a successful business. Web optimization comprises a number of different ideas, including search engine optimization, website analytics, and design factors, among many others.

However, optimization is more than just a standard set of practices. As every good interactive marketing agency knows, it is different for each business, and within each industry.

Those differences are one of the primary aspects that make ‘do-it-yourself’ optimization without an interactive marketing agency such a risky prospect. An interactive marketing agency keeps abreast of the ever-changing landscape in order to implement best practices to achieve good positioning and visibility for a website — they are also able to conduct in-depth research to understand what your competition is doing as well.

If you are learning from scratch and implementing as you go, you can be put at a disadvantage compared to competitors who hire professionals.

In this article, we’ll walk through some of the most common misconceptions about optimization. We’ll also look at what your company can do to see real optimization success.

Problem 1: Seeing Optimization as a Project With An “End Date”

Optimization, and online marketing in general, isn’t a destination. Rather, it’s a road, one that must be constantly traveled for optimal levels of success. There is no point at which your optimization is “complete”; in fact, even once your initial online marketing plan sees success, there will be other ways you can improve your online presence. The process can always be improved.

Problem 2: Not Planning For Optimization In The Long Run

Because online marketing is a process, wise companies will plan for optimization in the long run. Don’t think of it as a short-term investment, and don’t divert resources you are only comfortable diverting for a few weeks. Think about it more broadly, and give your optimization plan the time and support it needs to be successful. Like any company initiative, if the program is understaffed or underfunded, it won’t be able to thrive as it ought to.

Problem 3: Not Monitoring Progress

In the old days, it was next to impossible to know if your agency’s plan was doing the job. But now, tracking online marketing results is easy. Think of it like cooking: you have to taste the food every so often to see how it’s going. If you need to make a change, you learn about it early on, and if the food is great, you know more about how to make it the next time around.

Web optimization is exactly the same way. Keeping track of what policies bring success and which don’t will help you in the short term and in the long term. You will have more to work with when you start additional campaigns, and you’ll have real results that you can point to. So much depends on customer preference, and only when you start to get a feel for that preference will you see the best outcomes.

Problem 4: Working Alone

It is the rare person who can successfully design and implement an online marketing optimization strategy without the help of an interactive marketing agency. Optimization is a very particular process, with a number of techniques and strategies to learn. Articles like this one can help, but it takes years of experience to become a real optimization expert.

Does it really make sense for you to spend your time learning, rather than hiring the expertise of an interactive marketing agency? In almost every case, focusing on what you do best – running your business – is the best idea.

Putting it All Together

Now you know some of the most common pitfalls that make optimization programs fail. Do any of them sound familiar? If so, then you’re now equipped with the knowledge to change the problem. You can start fresh, and get the optimization results you’re looking for. You might not see them overnight, but with time the effect will be noticeable.

Thursday, July 2, 2009

How to get your lens into the top 100 & other great tips

Here are some tips on what I did:

* Be sure to have at least four modules (so your lens gets featured on Squidoo)

* Add as many tags as you can (you can use 40) about your topic so it is indexed higher when searched--Try to use phrases that people will actually enter when searching

* Post comments to other lenses so they will hopefully return the favor & visit your lens to comment, rank, and favorite. I also post messages in groups & forums online; I regularly use the Yahoogroups & Cafemom forums.

* Search other lenses about tips to making a good lens.

* Add Widgets to get more traffic. Lensmaster, thefluffanutta, has created a number of widgets you can easily add to your lenses. Check out his Love this lens? lens to get them. Love ya Fluffa...you create awesome stuff!!!

* Search for lenses to make money with Squidoo.

* Search other lenses on your topic to see how they did theirs. You don't want to make one that is already covering your topic...or you will want to make it differently...

* Join Squidoo groups for even more exposure for your lenses.

* Create back links to your lens by posting messages with your full lens address to blogs, social networking sites, forums & groups. Be sure to have your lens URL in email, blog & forum signatures lines too.

* Spread the word about any new content to your lens. Use a Squidcast, Twitter & Facebook first. The lens I was promoting is for Baby & Kids Freebies so it was easy to send out messages (posts) to my groups & blogs about any new freebie finds. I really concentrate on making sure the freebies are legitimate and I think that helps get return visitors to my lens & website because the freebies are real not a bunch of surveys & trial offers you have to fill out to get the freebie.

* Social bookmark your lens on sites like tagfoot, digg, StumbleUpon, del.icio.us, Facebook & Twitter. Make sure your Squidoo bio has set up the Twitter setting so you can send updates to Twitter with a single click!

* Be sure to "Ping" your lens. Do this every time you have a significant update to your site. Pinging your lens sends out a notice to search engines that your site has been updated. This helps your lens, blog, or site so it may be noticed sooner by google, yahoo, msn and the other search sites when they crawl the web. You can Ping your lens easily at SquidUtils.com after logging in and going to the "advanced dashboard".

* When creating a lens address use something that is a short phrase or words that would be searched on your topic and use an underscore or dash between the words. Ex: Use special_education_tips or special-education-tips not specialeducationtips.

* Let all of your friends know you have published a lens and ask them to visit, rate, comment, join and favorite. Friends will help you like that. :-)

* Lensroll lenses to yours that are appropriate and send a note or comment to that lens owner and ask them to do the same with yours.

* Add the google blog or news search to your lens (usually at the bottom). A lot of lensmasters use this module. I like it too. I usually mention how often it is updated too. This is an easy, quick way to keep your lens updated without doing a thing. When search engines crawl for updated sites, yours will be one of those. I usually set mine to update 1 x per day.

* Include a comment module on your lens. People like to comment. And, people like to read what others have to say. It is a useful module. It really bugs me when I have something on my mind to say and there is NO comment area. Plus again this keeps your lens updated (in the eyes of search engines).

* Include a links plexo for visitors to add their lens, blog or website. Great way to give others a way to get "Back Links".

* Look through the module selection list and play around with what is available. There are lots of great modules, and more are being created.

* Search Squidoo for "How to Monetize your website or Blog" & "How to get more traffic". Great tips can be found.

* Search Squidoo for lenses that give tips to improve your lens--there are a lot of helpful lenses already made. (I am going to make a lens that highlights the lenses that I have found most beneficial to me--I just do not have it done yet.) Here are some places I found helpful ~ The Squidoo Answer Deck and, for sure, the best resource.... Squidutils.com

* Be sure to go to "my dashboard" and find near your picture "edit bio" and be sure Allow Contact is set to yes so others can contact you through Squidoo. This has been a great help to me. It is very frustrating to want to contact a lensmaster and this feature is turned off.

* Save all of your hard work! Back up your lens. To do this you must be in the "edit" view of your lens. Look along the right column of tasks; under tags & lens settings you will see "Export". Click that to save (back up) your lens. I recommend saving it to a specific folder on your hard drive in html format.

Wednesday, July 1, 2009

Site Architecture and SEO – file/page issues

Source: Bing.com: Search engine optimization (SEO) has three fundamental pillars upon which successful optimization campaigns are run. Like a three-legged stool, take one away, and the whole thing fails to work. The SEO pillars include: content (which we initially discussed in Are you content with your content?), links (which we covered in Links: the good, the bad, and the ugly, Part 1 and Part 2), and last but not least, site architecture. You can have great content and a plethora of high quality inbound links from authority sites, but if your site’s structure is flawed or broken, then it will still not achieve the optimal page rank you desire from search engines.

The search engine web crawler (also known as a robot or, more simply, a bot) is the key to website architecture issues (Bing uses MSNBot). Think of the bot as a headless web browser, one that does not display what it sees, but instead interprets the HTML code it finds on a webpage and sends the content it discovers back to the search engine database so that it can be analyzed and indexed. You can even equate the bot to a very simple user. If you target your site’s content to be readable by that simple user (serving as a lowest common denominator), then more sophisticated users (running browsers like Internet Explorer 8 or Firefox 3) will most certainly keep up. Using that analogy, doing SEO for the bot is very much a usability effort.

If you care about your website being found in search (and I presume you do if you’re reading this column!), you’ll want to help the crawler do its job. Or at the very minimum, you should remove any obstacles under your control that can get in its way. The more efficiently the search engine bot crawls your site, the higher the likelihood that more of its content will end up in the index. And that, my friend, is how you show up in the search engine results pages (SERPs).

With site architecture issues for SEO, there’s a ton of material to cover. So much so, in fact, that I need to break up this subject into a multi-part series of blog posts. I’ve broken them down into subsets of issues that pertain to: HTML files (pages), URLs and links, and on-page content. I even plan a special post devoted solely to <head> tag optimizations for SEO.

So let’s kick off this multi-part series of posts with a look at SEO site architecture issues and solutions related to files and pages.

Use descriptive file and directory names

The more you can use descriptive text to represent your content, the better off your site will be. This even goes for file and directory names. Besides being far easier for end users to remember, the strategic use of keywords in file and directory names will further reinforce their relevance to those pages.

And while you’re examining the names of files and directories, avoid using underscores as word separators. Use hyphens instead. This syntax will help the bot to properly parse the long name you use into individual words instead of having it treated as the equivalent of a meaningless superlongkeyword.
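For instance, compare these purely illustrative URLs (the domain and file names are placeholders):

http://www.example.com/recipes/chocolate-chip-cookies.htm (descriptive, hyphen-separated words: preferred)
http://www.example.com/recipes/chocolate_chip_cookies.htm (underscores risk being read as one long run-together keyword)
http://www.example.com/content/page0047.htm (tells neither users nor the bot anything about the content)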

Limit directory depth

Bots don’t crawl endlessly, searching every possible nook and cranny of every website (unless yours is an important authority site, where the bot may probe deeper than usual). For the rest of us, though, creating a deep directory structure will likely mean the bot never gets to your deepest content. To alleviate this possibility, make your site’s directory structure shallow, no deeper than four child directories from the root.

Limit physical page file size

Keep your individual webpage files down under 150 KB each. Anything bigger than that and the bot may abandon the page after a partial crawl or skip crawling the page entirely.

Externalize on-page JavaScript and CSS code

If your pages use JavaScript and/or Cascading Style Sheets (CSS), make sure that content is not inline within the HTML page. Search bots want to get to the <body> tag content as quickly as possible. If your pages are filled with script and CSS code, you run the risk of making the pages too long to be effectively crawled. In fact, ensure that the <body> tag starts within the first 100 KB of the page’s source code; otherwise, the bot may not crawl the page at all.

Moving JavaScript and CSS code out of your pages into external files offers additional advantages beyond just shortening your webpage files. Because the code is external to the content it modifies, it can be used by multiple pages simultaneously. Externalizing this content also simplifies code maintenance.

The example below shows how to reference external JavaScript and CSS code in your HTML pages.
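It is only a minimal sketch; the file names and the /Scripts and /CSS paths are placeholders, following the directory convention mentioned in the notes below:

<head>
  <title>Example page</title>
  <!-- External JavaScript reference: the script code lives in a separate file -->
  <script type="text/javascript" src="/Scripts/site.js"></script>
  <!-- External CSS reference: the style rules live in a separate file -->
  <link rel="stylesheet" type="text/css" href="/CSS/site.css" />
</head>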

A few notes to consider. External file references are not supported in really old browser versions, such as Netscape Navigator 2.x and Microsoft Internet Explorer 3.x. But if the users of such old browsers are not your target audience, the benefits of externalizing this code will far outweigh that potential audience loss. I also recommend storing your external code files separately from your HTML code, such as in /Scripts and /CSS directories. This helps keep website elements organized, and you can then easily use your robots.txt file to block bot access to all of your code files (after all, sometimes scripts handle business confidential data, so preventing the indexing of those files might be a wise idea!).

Use 301 redirects for moved pages

When you move your site to a new domain or change folder and/or file names within your site, don’t lose all of your previously earned site ranking “link juice.” Search engines are quite literal in that the same pages on different domains or the same content using different file names are regarded as duplicates. Search engines also attribute rank to pages. Search engines have no way of knowing when you intend new page URLs to be considered updates of your old page URLs. So what do you do? Use an automatic redirect to manage this for you.

Automatic redirects are set up on your web server. If you don’t have direct access to your web server, ask your administrators to set this up for you. Otherwise, you’ll need to do a bit of research. First you need to know which type of HTTP redirect code you need. Unless your move is very temporary (in which case you’ll want to use a 302 redirect), use a 301 redirect for permanently moved pages. A 301 tells the search engine that the page has moved to a new location and that the new page is not a duplicate of the old page, but instead IS the old page at a new location. Thus when the bot attempts to crawl your old page location, it’ll be redirected to the new location, gather the new page’s content, and apply any and all changes to the existing page rank standing.

To learn how to set this up, you’ll first need to know which web server software is running your site. Once you know that, click either Windows Server Internet Information Server (IIS) or Apache HTTP Server to learn how you can set up 301 redirects on your website.
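For example, on Apache HTTP Server a permanent redirect for a single moved page can be declared with one line in the site configuration or an .htaccess file (the file names and domain below are placeholders):

# Permanently (301) redirect the old page URL to its new location
Redirect 301 /old-page.html http://www.example.com/new-page.html

Windows Server IIS offers an equivalent capability through its own redirect settings.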

Avoid JavaScript or meta refresh redirects

Technically you can also do page redirects with JavaScript or meta “refresh” tags. However, these are not recommended methods of accomplishing this task if you want to achieve optimal SEO results. These methods were highly abused in the past for hijacking users away from the content they wanted and sending them to web spam they didn’t want. As a result, search engines take a dim view of these redirection techniques. To do the job right, to preserve your link juice, and to continue your good standing with search engines, use 301 redirects instead.

Implement custom 404 pages

When a user makes a mistake while typing your URL into the address bar of their browser, or an inbound link contains a typo, the typical website pops up a generic HTTP 404 File Not Found error page. The most common end user response to that error message is to abandon the website. If that user had come to your site and, despite the error, you actually had the information they were seeking, that’s a lost business opportunity.

Instead of letting users go away thinking your site is broken, make an attempt to help them find what they want by showing a custom 404 page. Your page should look like the other page designs on your site, include an acknowledgment that the page the user was looking for doesn’t exist, and offer a link to your site’s home page and more importantly, access to either a site-wide search or an HTML-based sitemap page. At a minimum, make sure your site’s navigation tools are present, enabling the user to search for their content of interest before they leave.

Implementing a custom 404 page is dependent upon which web server you are using: For users of Windows Server IIS, check out the new Bing Web Page Error Toolkit. Otherwise, browse the 404 information for Apache HTTP Server.
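On Apache HTTP Server, for example, wiring up a custom error page takes a single ErrorDocument directive (the file name below is a placeholder for your own branded 404 page):

# Serve a helpful, branded error page instead of the generic 404 response
ErrorDocument 404 /custom-404.html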

Other crawler traps

The search engine bot doesn’t see the Web as you and I do. As such, there are several other page-related issues that can “trap” the bot, preventing it from seeing all of the content you intend to have indexed. For example, there are many page types that the bot doesn’t handle very well. If you use frames on your website (does anyone still use frames?), the bot will only see the frame page elements as individual pages. Thus, when it wants to see how each page interrelates with the other pages on your site, frame element pages are usually poor performers. This is because frame pages usually separate content from navigation. Content pages thus often become islands of isolated text that are not linked to directly by anything. And with no links to them, they might never get found. But even if the bot finds the frame’s navigation pane page, there’s no context to the links. This is pretty bad in terms of search engine relevance ranking.

Other types of pages that can trip up search engine bots include forms (there’s typically no useful content on a form page) and authentication pages (bots can’t execute authentication schemes, so they are blocked from seeing all of the pages behind the authentication gateway). Pages that require session IDs or cookies to be accessed are similar to authentication pages in that the bot’s inability to generate session IDs or accept cookies blocks it from accessing content requiring such tracking measures.

To keep the search engine bot from going places that might trip it up, see the following information about the “noindex” attribute to prevent indexing of whole pages.
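For reference, the “noindex” directive is most commonly applied with a robots meta tag in a page’s <head> section, for example:

<meta name="robots" content="noindex">

A page carrying this tag can still be crawled, but the search engine is being asked not to include it in its index.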

We’re only getting started here on site architecture issues. There’s plenty more to come. If you have any questions, comments, or suggestions, feel free to post them in our SEM forum. See you soon…

– Rick DeJarnette, Bing Webmaster Center