Thursday, July 2, 2009

How to get your lens into the top 100 & other great tips

Here are some tips on what I did:

* Be sure to have at least four modules (so your lens gets featured on Squidoo)

* Add as many tags as you can (you can use 40) about your topic so it is indexed higher when searched. Try to use phrases that people will actually enter when searching.

* Post comments on other lenses so their owners will hopefully return the favor and visit your lens to comment, rank, and favorite it. I also post messages in groups & forums online; I regularly use the Yahoo Groups and CafeMom forums.

* Search other lenses for tips on making a good lens.

* Add widgets to get more traffic. Lensmaster thefluffanutta has created a number of widgets you can easily add to your lenses. Check out his "Love this lens?" lens to get them. Love ya Fluffa...you create awesome stuff!!!

* Search for lenses about making money with Squidoo.

* Search other lenses on your topic to see how they did theirs. You don't want to make one that covers a topic that's already well covered...or, if you do, you'll want to approach it differently...

* Join Squidoo groups for even more exposure for your lenses.

* Create back links to your lens by posting messages with your full lens address to blogs, social networking sites, forums & groups. Be sure to have your lens URL in your email, blog & forum signature lines too.

* Spread the word about any new content on your lens. Use a Squidcast, Twitter & Facebook first. The lens I was promoting is for Baby & Kids Freebies, so it was easy to send out messages (posts) to my groups & blogs about any new freebie finds. I really concentrate on making sure the freebies are legitimate, and I think that helps get return visitors to my lens & website because the freebies are real, not a bunch of surveys & trial offers you have to fill out to get the freebie.

* Social bookmark your lens on sites like tagfoot, Digg, StumbleUpon, del.icio.us, Facebook & Twitter. Make sure you have set up the Twitter option in your Squidoo bio so you can send updates to Twitter with a single click!

* Be sure to "Ping" your lens. Do this every time you have a significant update to your site. Pinging your lens sends out a notice to search engines that your site has been updated. This helps your lens, blog, or site so it may be noticed sooner by google, yahoo, msn and the other search sites when they crawl the web. You can Ping your lens easily at SquidUtils.com after logging in and going to the "advanced dashboard".

* When creating a lens address, use a short phrase or words that would be searched on your topic, and use an underscore or dash between the words. Ex: Use special_education_tips or special-education-tips, not specialeducationtips.

* Let all of your friends know you have published a lens and ask them to visit, rate, comment, join and favorite. Friends will help you like that. :-)

* Lensroll appropriate lenses to yours, then send a note or comment to each lens owner and ask them to do the same with yours.

* Add the Google blog or news search to your lens (usually at the bottom). A lot of lensmasters use this module, and I like it too. I usually mention how often it is updated as well. This is a quick, easy way to keep your lens updated without doing a thing, so when search engines crawl for recently updated sites, yours will be one of them. I usually set mine to update once per day.

* Include a comment module on your lens. People like to comment, and people like to read what others have to say. It is a useful module. It really bugs me when I have something on my mind to say and there is NO comment area. Plus, again, this keeps your lens updated (in the eyes of search engines).

* Include a links plexo for visitors to add their lens, blog or website. It's a great way to give others a way to get "back links".

* Look through the module selection list and play around with what is available. There are lots of great modules, and more are being created.

* Search Squidoo for "How to Monetize your website or Blog" & "How to get more traffic". Great tips can be found.

* Search Squidoo for lenses that give tips to improve your lens--there are a lot of helpful lenses already made. (I am going to make a lens that highlights the lenses I have found most beneficial to me--I just do not have it done yet.) Here are some places I found helpful ~ The Squidoo Answer Deck and, for sure, the best resource.... SquidUtils.com

* Be sure to go to "my dashboard" and find near your picture "edit bio" and be sure Allow Contact is set to yes so others can contact you through Squidoo. This has been a great help to me. It is very frustrating to want to contact a lensmaster and this feature is turned off.

* Save all of your hard work! Back up your lens. To do this you must be in the "edit" view of your lens. Look along the right column of tasks; under tags & lens settings you will see "Export". Click that to save (back up) your lens. I recommend saving it to a specific folder on your hard drive in HTML format.

Wednesday, July 1, 2009

Site Architecture and SEO – file/page issues

Source: Bing.com: Search engine optimization (SEO) has three fundamental pillars upon which successful optimization campaigns are run. Like a three-legged stool, take one away, and the whole thing fails to work. The SEO pillars include: content (which we initially discussed in Are you content with your content?), links (which we covered in Links: the good, the bad, and the ugly, Part 1 and Part 2), and last but not least, site architecture. You can have great content and a plethora of high quality inbound links from authority sites, but if your site’s structure is flawed or broken, then it will still not achieve the optimal page rank you desire from search engines.

The search engine web crawler (also known as a robot or, more simply, a bot) is the key to website architecture issues (Bing uses MSNBot). Think of the bot as a headless web browser, one that does not display what it sees, but instead interprets the HTML code it finds on a webpage and sends the content it discovers back to the search engine database so that it can be analyzed and indexed. You can even equate the bot to a very simple user. If you target your site’s content to be readable by that simple user (serving as a lowest common denominator), then more sophisticated users (running browsers like Internet Explorer 8 or Firefox 3) will most certainly keep up. Using that analogy, doing SEO for the bot is very much a usability effort.

If you care about your website being found in search (and I presume you do if you’re reading this column!), you’ll want to help the crawler do its job. Or at the very minimum, you should remove any obstacles under your control that can get in its way. The more efficiently the search engine bot crawls your site, the higher the likelihood that more of its content will end up in the index. And that, my friend, is how you show up in the search engine results pages (SERPs).

With site architecture issues for SEO, there’s a ton of material to cover. So much so, in fact, that I need to break up this subject into a multi-part series of blog posts. I’ve broken them down into subsets of issues that pertain to: HTML files (pages), URLs and links, and on-page content. I even plan a special post devoted solely to <head> tag optimizations for SEO.

So let’s kick off this multi-part series of posts with a look at SEO site architecture issues and solutions related to files and pages.

Use descriptive file and directory names

The more you can use descriptive text to represent your content, the better off your site will be. This goes for file and directory names, too. Besides being far easier for end users to remember, the strategic use of keywords in file and directory names further reinforces the relevance of those pages.

And while you’re examining the names of files and directories, avoid using underscores as word separators. Use hyphens instead. This syntax will help the bot to properly parse the long name you use into individual words instead of having it treated as the equivalent of a meaningless superlongkeyword.

Limit directory depth

Bots don’t crawl endlessly, searching every possible nook and cranny of every website (unless you are an important authority site, where it may probe deeper than usual). For the rest of us, though, creating a deep directory structure will likely mean the bot never gets to your deepest content. To alleviate this possibility, make your site’s directory structure shallow, no deeper than four child directories from the root.

Limit physical page file size

Keep your individual webpage files down under 150 KB each. Anything bigger than that and the bot may abandon the page after a partial crawl or skip crawling the page entirely.

Externalize on-page JavaScript and CSS code

If your pages use JavaScript and/or Cascading Style Sheets (CSS), make sure that content is not inline within the HTML page. Search bots want to see the <body> tag content as quickly as possible. If your pages are filled with script and CSS code, you run the risk of making the pages too long to be effectively crawled. In fact, ensure that the <body> tag starts within the first 100 KB of the page’s source code; otherwise, the bot may not crawl the page at all.

Moving JavaScript and CSS code out of your pages into external files offers additional advantages beyond just shortening your webpage files. Because the external files are separate from the content they modify, they can be used by multiple pages simultaneously. Externalizing this content also simplifies code maintenance.

The following examples show how to reference external JavaScript and CSS code in your HTML pages.
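As a minimal sketch (the file names and the /Scripts and /CSS folders here are placeholders, not anything Bing prescribes), the references go in the page's <head> so the markup itself stays small:

    <head>
      <title>Example page</title>
      <!-- style rules live in an external file instead of an inline <style> block -->
      <link rel="stylesheet" type="text/css" href="/CSS/site.css" />
      <!-- script code lives in an external file instead of an inline <script> block -->
      <script type="text/javascript" src="/Scripts/site.js"></script>
    </head>

With the code pulled out, the <body> tag and the real content begin almost immediately, which is exactly what the crawler wants to see.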

A few notes to consider. External file references are not supported in really old browser versions, such as Netscape Navigator 2.x and Microsoft Internet Explorer 3.x. But if the users of such old browsers are not your target audience, the benefits of externalizing this code will far outweigh that potential audience loss. I also recommend storing your external code files separately from your HTML code, such as in /Scripts and /CSS directories. This helps keep website elements organized, and you can then easily use your robots.txt file to block bot access to all of your code files (after all, sometimes scripts handle business confidential data, so preventing the indexing of those files might be a wise idea!).
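As a rough sketch of that last point (again assuming the placeholder /Scripts and /CSS folder names above), the robots.txt entries that keep bots out of the code directories are just:

    User-agent: *
    Disallow: /Scripts/
    Disallow: /CSS/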

Use 301 redirects for moved pages

When you move your site to a new domain or change folder and/or file names within your site, don’t lose all of your previously earned site ranking “link juice.” Search engines are quite literal in that the same pages on different domains or the same content using different file names are regarded as duplicates. Search engines also attribute rank to pages. Search engines have no way of knowing when you intend new page URLs to be considered updates of your old page URLs. So what do you do? Use an automatic redirect to manage this for you.

Automatic redirects are set up on your web server. If you don’t have direct access to your web server, ask your administrators to set this up for you. Otherwise, you’ll need to do a bit of research. First you need to know which type of HTTP redirect code you need. Unless your move is very temporary (in which case you’ll want to use a 302 redirect), use a 301 redirect for permanently moved pages. A 301 tells the search engine that the page has moved to a new location and that the new page is not a duplicate of the old page, but instead IS the old page at a new location. Thus when the bot attempts to crawl your old page location, it’ll be redirected to the new location, gather the new page’s content, and apply any and all changes made to the existing page rank standing.

To learn how to set this up, you’ll first need to know which web server software is running your site. Once you know that, click either Windows Server Internet Information Server (IIS) or Apache HTTP Server to learn how you can set up 301 redirects on your website.
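For instance, here is a minimal sketch of a permanent redirect on Apache HTTP Server (the paths and domain are placeholders, and it assumes the standard Redirect directive is available): one line in the site configuration or an .htaccess file does the job.

    # .htaccess at the old location: send visitors and bots to the new page with a 301
    Redirect permanent /old-folder/old-page.html http://www.example.com/new-folder/new-page.html

IIS handles the equivalent through its own redirect settings rather than a text file; the documentation linked above covers the details.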

Avoid JavaScript or meta refresh redirects

Technically you can also do page redirects with JavaScript or meta “refresh” tags. However, these are not recommended methods if you want to accomplish the task and still achieve optimal SEO results. These methods were highly abused in the past for hijacking users away from the content they wanted and sending them to web spam they didn’t want. As a result, search engines take a dim view of these redirection techniques. To do the job right, to preserve your link juice, and to continue your good standing with search engines, use 301 redirects instead.

Implement custom 404 pages

When a user makes a mistake while typing your URL into the address bar of their browser, or an inbound link contains a typo, the typical website pops up a generic HTTP 404 File Not Found error page. The most common end user response to that error message is to abandon the webpage. If that user had come to your website and, despite the error, you actually had the information they were seeking, that’s a lost business opportunity.

Instead of letting users go away thinking your site is broken, make an attempt to help them find what they want by showing a custom 404 page. Your page should look like the other page designs on your site, include an acknowledgment that the page the user was looking for doesn’t exist, and offer a link to your site’s home page and, more importantly, access to either a site-wide search or an HTML-based sitemap page. At a minimum, make sure your site’s navigation tools are present, enabling the user to search for the content they’re interested in before they leave.

Implementing a custom 404 page is dependent upon which web server you are using: For users of Windows Server IIS, check out the new Bing Web Page Error Toolkit. Otherwise, browse the 404 information for Apache HTTP Server.
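On Apache, for instance, a minimal sketch looks like this (the page name is a placeholder, and the custom page itself is just plain HTML you design to match your site):

    # .htaccess: serve our branded not-found page instead of the server's generic error
    ErrorDocument 404 /not-found.html

The /not-found.html page is where you put the acknowledgment, the link to your home page, and the search box or sitemap link mentioned above.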

Other crawler traps

The search engine bot doesn’t see the Web the way you and I do. As such, there are several other page-related issues that can “trap” the bot, preventing it from seeing all of the content you intend to have indexed. For example, there are many page types that the bot doesn’t handle very well. If you use frames on your website (does anyone still use frames?), the bot will only see the frame page elements as individual pages. Thus, when it wants to see how each page interrelates with other pages on your site, frame element pages are usually poor performers. This is because frame pages usually separate content from navigation. Content pages often become islands of isolated text that are not linked to directly by anything, and with no links to them, they might never get found. But even if the bot finds the frame’s navigation pane page, there’s no context to the links. This is pretty bad in terms of search engine relevance ranking.

Other types of pages that can trip up search engine bots include forms (there’s typically no useful content on a form page) and authentication pages (bots can’t execute authentication schemes, so they are blocked from seeing all of the pages behind the authentication gateway). Pages that require either session IDs or cookies to be accessed are similar to authentication pages in that the bot’s inability to generate session IDs or accept cookies blocks it from accessing content that requires such tracking measures.

To keep the search engine bot from going places that might trip it up, see the following information about the “noindex” attribute to prevent indexing of whole pages.
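As a quick sketch, the standard way to do that is the robots meta tag placed in a page's <head>; this particular value keeps the page out of the index while still letting the bot follow its links (use "noindex, nofollow" if you don't even want the links followed):

    <!-- keep this page out of the search index -->
    <meta name="robots" content="noindex" />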

We’re only getting started here on site architecture issues. There’s plenty more to come. If you have any questions, comments, or suggestions, feel free to post them in our SEM forum. See you soon…

– Rick DeJarnette, Bing Webmaster Center

Monday, June 29, 2009

15 Nifty SEO Google Alert Tips




You may know that you can get the latest news headline links using Google alerts.
Simply go to http://www.news.google.com and put in a search for something you want to know more about.

For instance, I may want to get updates on news about "search engine marketing". After you get the results on that page, scroll down to the bottom. In the middle you will see "New! Get the latest news on search engine marketing with Google Alerts." Click the link to go to the Alerts page.

On the Alerts page you can tell Google how often you want to receive the alerts (I always choose "once a day") and to which email account you want to receive the alerts (some people have many email accounts to choose from). Then hit the "Create Alert" button and you will start receiving alerts for the term you searched. Easy enough, unless you are lazy like me. See, I never thought to investigate the "Type" of search result I was looking for, so I was getting just news. I could also have been getting blog, web, video and groups alerts. You also have the option to receive "Comprehensive" alerts. Now I select that option. You can subscribe to alerts in multiple languages.

You can receive up to 1,000 alerts. Woot!

Here are some ideas about how you can use Google Alerts.

1. Monitor your competitors - new products, ideas, financial changes - competitive intelligence.
2. Monitor your customers and prospects - it would be nice to send them a card when they do something newsworthy.
3. Track your name and your business name - put quotes around the phrases, like "Joe Jones" or "Pete's Pies" - what are people saying about you or your company in the blogs?
4. On the "Advanced Search" page you can narrow your search by geographical location, date and other parameters.
5. Track news about new software releases or version upgrades.
6. Local news - track the subject and the newspaper.
7. Want to know when someone links to your website or blog? Search link:myblogname.com
8. Authors - get ideas for a new article.
9. Niches - more ideas and what is happening in your niche.
10. Job seekers - think of the many ways to use this to learn more about the job market.
11. When is a new page from your blog included in Google? Type in a unique line from your article.
12. cache: - see what a page looked like earlier - cache:sitename.com
13. site: - get results from just one website.
14. related: - what does Google think is related to the site - related:www.sitename.com
15. inurl: - search for terms in page URLs - inurl:seo

Leave your good Google Alert tips in the comments.

Bing Webmaster Tools

Source: bing.com: Use the Webmaster Tools to troubleshoot the crawling and indexing of your site, submit sitemaps and view statistics about your sites. Get data on how many pages of your site have been indexed, backlinks, inbound links and keyword performance.

To submit your site to Bing:

To request that Bing crawl your site, submit your site’s domain to http://www.bing.com/docs/submit.aspx.

To submit your Sitemap:

To submit an XML-based Sitemap for your site, copy and paste the below URL into the address bar of your browser–be sure to change “www.YourWebAddress.com” to your domain name–and then press ENTER:

http://www.bing.com/webmaster/ping.aspx?sitemap=www.YourWebAddress.com/sitemap.xml

Get the SEO Toolkit: The IIS Search Engine Optimization Toolkit helps improve a Web site’s relevance in search results.

More>>http://www.bing.com/toolbox/webmasters/default.aspx

New tools for webmasters in the Bing Toolbox

Today we’re really excited to announce the arrival of the Bing Toolbox, a new portal for all you Bing webmasters, publishers, developers, and advertisers out there. The Toolbox is an organized set of tools for the entire Bing community, plus links to our Webmaster and Developer community blogs and forums.

Thursday, June 25, 2009

Matt Cutts Answers Questions About Directories and Ranking

As you may know, Google’s Matt Cutts frequently answers questions from Google users on the Google Webmaster Central YouTube channel. There are a couple of recent ones in which he addresses questions about directories and how they contribute to a site’s rankings.

The first question is:

Will Google consider Yahoo! Directory and BOTW (Best of the Web) as sources of paid links? If no, why is this different from another site that sells links?


When Google looks at whether or not a directory is useful to users, Google looks at:

- What is the value-add?

- Do they go out and find entries on their own or do they only wait for people to come to them?

- How much do they charge?

- What is the editorial service that’s being provided for the charge?

“If a directory takes $50 and every single person who ever applies in the directory automatically gets in for that $50, there’s not as much editorial oversight as something like the Yahoo! Directory, where people do get rejected,” says Cutts. “So if there is no editorial value-add there, then that is much closer to paid links.”

The second question is:

We sell a software product, and there are 100s of software download directories on the web of varying quality. Could submitting our product to all of them hurt our rankings or domain trust/authority?

Infosys co-chair Nilekani quits to join India govt

Source: NEW DELHI, June 25 (Reuters) – Nandan Nilekani, co-chairman of Infosys Technologies Ltd (INFY.BO), India’s No. 2 outsourcer, has resigned from the company’s board to join the government, the company said on Thursday.

Nilekani, one of the founders of Infosys, has been invited by Prime Minister Manmohan Singh to head government agency Unique Identification Authority of India in the rank of a cabinet minister, Infosys said in a statement.

Nilekani, a former chief executive of the company, had not been involved in active management since becoming co-chairman in 2007.

Shares in Infosys, which has a market value of about $21 billion, were up 0.7 percent at 1,771.25 rupees at 0849 GMT in a Mumbai market down 0.5 percent. (Reporting by Devidutta Tripathy; Editing by Ranjit Gangadharan)
© Thomson Reuters 2009 All rights reserved

Weiner Becomes LinkedIn CEO


LinkedIn has named Jeff Weiner (a former Yahoo! executive) as its CEO, replacing Reid Hoffman, who will remain as founder and executive chairman. Jeff Weiner has updated his profile from President to CEO of LinkedIn.


Bing White Paper for Webmasters

The Microsoft Bing Webmaster Team has just released a new white paper titled, Bing: New Features Relevant to Webmasters.

Microsoft released white paper documentation for webmasters about its new Bing search engine, explaining how Bing affects search-engine optimization (SEO).



This white paper discusses the key new features of the Bing search engine results page (SERP) presentation that are relevant to webmasters and web publishers. It also discusses search engine optimization (SEO) considerations needed for Bing.

The following files are available for download from the Microsoft Download Center:

Bing for Webmasters – New Features Relevant to Webmasters white paper
This white paper is an overview of the key features in Bing of interest to webmasters.

http://www.microsoft.com/downloads/details.aspx?FamilyID=b93cfee4-7dfb-40ae-a405-dfa269a33a18&displayLang=en
Download PDF -> Bing–NewFeaturesForWebmasters.pdf
Download XPS -> Bing–NewFeaturesForWebmasters.xps
As an owner of a website or a publisher of content on the Web, you are no doubt interested in the big changes Microsoft has implemented with the newly released version of its search engine, dubbed Bing. There are a great number of innovative changes in how searchers access search content in the Bing index, and as a result, on your website. This white paper was written for you, the folks whose content populates the Bing index. You’ll want to know what Bing is, what changes in Bing pertain to you, and how to make the most of those changes so you get more eyeballs on your website’s content. Let’s get right to it.

Google Profile | Create your Profile in Google

Social community sites are hugely popular for building a fan base and sharing your thoughts, ideas, suggestions, stories, blogs, sites, and the products you sell on the internet. Most of us already keep a profile on the most popular social sites, like Digg, Facebook, LinkedIn and MySpace.




Google is now promoting profile creation right on Google.com with the text below; Google often highlights its new updates on its main page so that all visitors take serious note of them. You can also get a static URL for your profile on Google.com.

Hidden Content Sources for Your Website

As I travel through the Google search engine, there is one element that defines almost all of the top-ranking websites. It's great content. People come online for information and those that offer the best content reap the greatest rewards.

Unfortunately, this type of content is hard to come by. In most cases, you either have to spend hours in front of the keyboard or outsource the job to others. Both of these options are very costly. One requires your valuable time and the other requires an investment of around $10 - $20 per article.

That's why I have scoured the net in search of valuable free content sources. I'm still not quite sure why I'm revealing my treasured piles of free content, but I certainly hope you enjoy them.

One of my favorite sources of content is public domain. This comprises the body of knowledge without a copyright. Anyone can use this material for commercial or non-commercial purposes. Below are some excellent sources for public domain material.

Archive.org

At Archive.org you can find thousands of works that are currently in the public domain. Want to put some cartoons on your web site? Check out the "Film Chest Vintage Cartoons" collection, which is full of classic animated cartoons from the 1930s and 1940s. The collection includes Popeye, Porky Pig, Bugs Bunny, Woody Woodpecker, The Three Stooges and Betty Boop. They also provide tons of other reproducible content, including:

Brick Films: Commonly called "LEGO Movies". Brick films are dedicated to the art of stop motion animation.

SabuCat Movie Trailers: The world's largest collection of theatrical trailers.

Feature Films: A large number of classic feature films and shorts.

Universal Newsreels: Newsreels were shown before every feature film in the pre-TV era.

Computer Chronicles: The world's most popular television program on personal technology during the height of the computer revolution.

Net Cafe: A television series covering the Internet revolution during the height of the dot-com boom.

All of these content sources are available for you to put on your website. Check them out at Archive.org.

Another popular source of reusable content is Wikipedia.org. Here you will find over 1 million articles ranging from Greek mythology and Egyptian history to business, health, and technology. (Note that Wikipedia content is not actually public domain; it is published under a free license that requires attribution when you reuse it.)

Go to Wikipedia.org for a huge collection of articles you can reprint on your own website.

Creative Commons

Every creative work receives copyright protection as soon as you put pen to paper, hit save, or press record. Because of this, no one can use that work without express permission from the author.

Creative Commons provides a new content license that allows you to share your work with others. If you want, you can even allow other people to expand upon your existing work. This allows for creative co-authorship.

The Creative Commons license has made piles of content available for use on your web site. Whether you are looking for audio, images, video, or text, you can find an abundance of reusable information within the creative commons.

To search for content to put on your own web site, go to CreativeCommons.org.

Government Web Sites

Works produced by the U.S. Federal government are not copyrighted. If you obtain a government document from the net, you are free to copy and distribute the document. I have found plenty of great content about finance, retirement, health, business, and traveling on government websites.

To search for content offered by the United States government, go to http://www.google.com/unclesam.

Article Directories

There are thousands of writers on the internet and many of them would love for you to reprint their articles on your website. You can find thousands of free web articles at the following article directories.

GoArticles.com
EzineArticles.com
ArticleCity.com

Interviews

I consider interviews to be one of the best sources of quality content for your site. Simply interview industry professionals and post the recording and transcript on your website. This allows you to create original content very quickly.

Don't be afraid to ask for an interview, most experts would be delighted to speak with you. Remember, this is probably one of their greatest passions. If you ask them politely, your chances for landing an interview are good.

You can conduct an interview in person, over the phone, or even through an e-mailed questionnaire.

RSS Feeds

RSS is changing the way we consume information online. In addition, it has provided thousands of new content sources for the online publisher. RSS is simply an XML-based file format that publishers use to make their content available to others in a form that can be easily understood by web publishing software and content aggregators.
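As a rough sketch (every title and URL below is made up for illustration), a minimal RSS 2.0 feed is just a small XML document with a channel and one or more items:

    <?xml version="1.0" encoding="UTF-8"?>
    <rss version="2.0">
      <channel>
        <title>Example Site Updates</title>
        <link>http://www.example.com/</link>
        <description>Latest articles from an example site</description>
        <item>
          <title>First article</title>
          <link>http://www.example.com/first-article.html</link>
          <description>A short summary of the article.</description>
        </item>
      </channel>
    </rss>

An aggregator or web publishing tool reads that file and turns each item into a headline and link on your page, which is how feeds keep your site's content fresh automatically.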

By using RSS feeds, you can enhance the content on your site without ever writing a single word. And remember, on the Internet, content is King.

Want to put Amazon products on your site, updated news from the New York Times, financial advice from Motley Fool, or press releases from PRWeb?

This is all possible with RSS. No matter what type of information you are looking for, RSS can provide you with a constant stream of updated content for your web site.

To search for an RSS feed to enhance your own website, go to Syndic8.com. You can even mix and match a variety of RSS feeds at RSSMix.com.

Facts & Statistics

Looking for facts or figures to put on your website? Take a look at some of the sources below. You'll likely be surprised how many facts, figures, and definitions are available in the public domain.

The CIA World Factbook provides a number of statistics on countries, territories, and dependencies. Each profile tracks such demographics as population, ethnicity, and literacy rates, as well as political, geographical and economic data.

http://www.census.gov/: One of the largest repositories for data and statistics related to the U.S.

http://www.1911encyclopedia.org/Main_Page: One of the best encyclopedias ever written was published over 90 years ago. Search over 40,000 articles, all of which are available for publication on your own site.

http://www.bibliomania.com/2/3/257/frameset.html: A searchable interface of the 1913 public domain Webster's dictionary.

Private Label Articles

Private label articles can be bought for pennies per article. This is possible because they are sold in bulk.

Many people criticize these articles and have declared them as worthless. However, I am here to tell you that private label articles can be very powerful when used appropriately.

Unless you have hours of free time every day, it is unlikely that you are going to be able to create the amount of quality content that your web site deserves. This is where private label content enters the picture.

You can use private label articles to:

  • Add content to your web site.
  • Acquire hundreds of inbound links by syndicating the articles to article directories.
  • Create a free report for your visitors and other website publishers.
  • Create an information-packed RSS feed.

However, the key to using private label articles effectively is to optimize them. Straight out of the box, these articles are near worthless. To give them value, you must add your own touch.

Inject your personality into the article. Combine multiple articles and do some additional touch-ups to ensure that the article is in top shape for your readers.

Once you are finished you can add your resource box and send it off to article directories and website publishers.

If you are looking for one of the top private label article providers, go to InfoGoRound.com.

Images

Quality images can make your content much more inviting and keep people at your site for longer periods of time. Fortunately, there is a site that offers thousands of pictures completely free of charge. Find it at Stock.XCHNG.

Quotes

Quotes can give your website a special touch. Quotes provide interesting content in addition to an element of credibility. I often like to add related quotes to my web site simply to engage the reader's attention.

To find some quotes for your website, go to QuoteLand.com.

In the end, you always want your content to be unique. Not for the search engines, but for your visitors. With quality content comes quality links. Once you have built up a reputation for delivering unique content, you will never have to worry about having an audience eager to visit your website.

Thursday, May 14, 2009

Meet Wonder Wheel & Google Squared - The Next Frontiers of Search



Source from Here
We may be coming upon a new era for the Internet search results. And, despite what you may think, Google is not the only player.

New search engines that are popping up across the Web strive to make searches faster, smarter, more personal and more visually interesting.

Some sites, like Twine and hakia, will try to personalize searches, separating out results you would find interesting, based on your Web use. Others, like Searchme, offer iTunes-like interfaces that let users shuffle through photos and images instead of the standard list of hyperlinks. Kosmix bundles information by type - from Twitter, from Facebook, from blogs, from the government - to make it easier to consume.

Google also showed off something it called the “Wonder Wheel,” a graphical way to explore topics by clicking on related searches that go deeper into the subject of the main query. It also showcased Google Squared, a tool designed to chart research into columns and rows for those who are trying to track and organize information they get from the Web. Google Squared will be available in Labs later this month.

Wolfram Alpha, set to launch soon, is more of an enormous calculator than a search: It crunches data to come up with query answers that may not exist online until you search for them. And sites like Twitter are trying to capitalize on the warp-speed pace of online news today by offering real-time searches of online chatter - something Google’s computers have yet to replicate.

As of this writing the new options aren’t available to everyone, but Google says they should be available to all users in the next 24 hours. These new features will be very helpful to students in need of assistance in refining searches or help in thinking of alternative search terms.

Wednesday, May 6, 2009

US President Barack Obama Has Targeted Outsourcing Industry Once Again - NASSCOM


US President Barack Obama has targeted outsourcing once again in his attempt to bring the ailing US economy back on track, and the clouds seem to be getting darker for Indian IT companies.

Indeed, Obama’s proposal to limit U.S. companies’ ability to defer paying U.S. taxes on offshore earnings does put Bay Area companies doing a lot of business overseas directly in the crosshairs. “It would adversely impact our ability to invest and grow our business in the (United States) and to compete against our foreign competitors,” said a spokesman for Cisco.

Google, whose CEO, Eric Schmidt, is supposed to be a close buddy of Obama’s, said it is “too early to evaluate the potential effect on Google’s operations, as there will likely be multiple proposals considered.”

IT body NASSCOM said it is still reviewing the tax proposals announced by Obama. However, “prima facie, the proposals appeared to be aimed at addressing the tax rate differentials that exist across the world and if implemented, this would impact American headquartered companies with overseas operations,” it said.

source from here

Monday, February 23, 2009

Adam Lasnik, Webmasters Search Camp

Adam Lasnik, Google’s first Search Evangelist, who broadens the company’s online & offline communications with all webmasters, will be part of Search Camp and will be answering participants’ questions and doubts about SEO. Don’t miss this golden opportunity...

just register for Search Camp now
Visit: http://searchcamp.in/

Thursday, February 19, 2009

What Is SEO?

What Is SEO? (Search Engine Optimization)
Search Engine Optimization is not just about getting top rankings on search engines. SEO is the process of making a web site (and its pages) search engine friendly. The main aim of good SEO is to get a good number of visitors to the site and convert them into sales.

Good SEO is the kind that brings sales, not just rankings. It is a widespread observation that the higher your website ranks in the searches, the more visitors you get. If your website ranks well in the SERPs and you have quality information and services, you will get more sales.

SEO is the way to draw huge amounts of free traffic from search engines. Sure, you may need to find an efficient search engine optimization company to make it happen.




Search engines frequently index millions of pages for certain target keywords, and a website can be buried deep on page 100 of the results, or worse, if it is not optimized correctly.

Thursday, February 12, 2009

Black Hat SEO Techniques - Harmful techniques for any website.

Black hat search engine optimization is customarily defined as techniques used to get higher search rankings in an unethical manner. These black hat SEO techniques are deemed sneaky and deceptive, and they are generally practices that are not accepted by the search engines or that break search engine guidelines, such as doorway pages, content generation, keyword stuffing, misrepresenting content, buying links, spam indexing, cloaking, link baiting, data mining, link spamming, and even PageRank hijacking. We should not use these black hat SEO techniques.

Depending on the competitiveness of a keyword, you can see results very quickly, sometimes within a day; for more competitive terms you can see placement within a few weeks. However, black hat SEO is watched closely by the search engines, and bad results await those who use black hat techniques. Black hats see search engines as the enemy; deceit is the name of the game, but the search engines are constantly looking to expose trickery and penalize guilty sites, and eventually your site will be banned. Being banned means your site is removed completely from the search engines' index, and the rank gained by these tactics will finally be lost.

Doorway page: a highly optimized web page whose purpose is to direct traffic to target pages, either using a code-based redirect method or merely by being full of links that direct you to these other web pages; it stands in as a representative for the other pages.

Keyword stuffing is literally the practice of repeating a keyword at a high rate on a given web page. If a search engine suspects keyword stuffing, there is a good chance it will penalize you for it.

Interlinking: setting up various websites on a topic and having them all link to each other in order to boost their relevance, and subsequently their rankings, in the search engines.

Buying expired domains: buying expired websites that had good PageRank in order to try to keep the sites' inbound links.

Cloaking: a technique where a webmaster keeps two different copies of a given web page, one of which is shown to the search engine spiders and one of which is shown to the regular web surfer. In other words, one page is shown to Googlebot and a completely different page is shown to real human visitors.

Selling PageRank: if you have a high-PR website, you sell links from your website to another site for money. This in turn helps the buying site rank higher in Google.

Invisible text: the technique of filling a web page with text that is the same color as the background, e.g., white text on a white background.

Spam page: a page loaded with a bunch of ads or listings of other sites (much like doorway pages) that a webmaster makes money off of if someone clicks on them.

Tuesday, February 10, 2009