1. How Search Engines Work and Why They Never Stay the Same
2. An Introduction to How Google 'Crawls' the Net and How It Indexes Your Website
3. Keyword Quantifier - Are You Getting the Right Impressions?
4. Keywords for Search Engines
5. Searching Strategies
6. How to Defend your Website from the Google Duplicate Proxy Exploit
7. 10 Most Valuable Free Google Marketing Tools
1. How Search Engines Work and Why They Never Stay the Same
Here is where search engines like Bing, Google or Yahoo! become essential. Without them, we would get completely lost in the immensity of online information, the volume of which grows every day. Search engines light our way through the dark waters of the Web and rescue us from drowning in the unexplored. Why can search engines tell one thing from another and help us find things? Here is the deal. What search engines do all day is explore: they crawl the Web using search engine bots, also known as crawlers, and they record everything those bots find in a database called the index. This is how they know stuff. Now, how do they manage to provide exactly what you are looking for? They evaluate sites according to different criteria and use their findings as signals that help them group websites and return relevant search results.
What are those signals? Search engines look at how many times one and the same phrase is repeated on the webpage, whether its content is unique or stolen from some other site, how many people like the website and declare it by linking to it or sharing it on social media sites, etc.
Every search engine uses its own algorithm according to which it picks out the sites it considers relevant for a particular search term. It then ranks those sites, displaying the most relevant and best sites first, and the least relevant and worst sites last. Therefore, if you think about it, what you see in search results is based on how search engines make up their artificial minds.
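To make that idea concrete, here is a toy sketch of how signals like the ones above might be combined into a ranking score. It is purely illustrative: real engines use a huge number of signals and weights that are not public, and the pages and numbers below are made up.

```python
# A toy relevance score combining the kinds of signals described above.
# The weights and example pages are invented for illustration only.
def relevance_score(page, query):
    words = page["text"].lower().split()
    hits = sum(words.count(term) for term in query.lower().split())
    keyword_signal = hits / max(len(words), 1)             # how often the query terms appear
    uniqueness_signal = 1.0 if page["original"] else 0.0   # copied content contributes nothing
    link_signal = page["inbound_links"] ** 0.5             # diminishing returns on extra links
    return 10 * keyword_signal + uniqueness_signal + 0.1 * link_signal

pages = [
    {"url": "a.example", "text": "blue widget guide: choosing the right blue widget",
     "original": True, "inbound_links": 40},
    {"url": "b.example", "text": "cheap widgets and assorted other stuff",
     "original": False, "inbound_links": 3},
]

# Rank the pages for a query, best first.
ranked = sorted(pages, key=lambda p: relevance_score(p, "blue widget"), reverse=True)
for p in ranked:
    print(p["url"], round(relevance_score(p, "blue widget"), 3))
```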
That is what I mean when I say "the power search engines have over your search". In a lot of ways, you look at the Web through a search engine's eyes. And there is basically no way around it.
But, along with having substantial power over your search results, search engines also face problems. Their biggest problem is the ever-growing bulk of information on the Web, and the low-quality websites that want to seem nice and relevant while in fact they are spammy and fraudulent.
Thus, search engines wage an ongoing struggle to keep their search results clean and relevant, while search engine optimizers who practice the dark arts of Web marketing hone their skills at cheating the search engines. That is why search engines never stand still - they constantly evolve. Some change their algos once a week, some do it less often. Sometimes a search engine rolls out a major update that substantially changes its algo. This normally sends ripples all over the Web, and sometimes even affects relevant sites' positions in a bad way.
The thing is, not all site owners and Internet marketers who want a higher position in search results pages are cheaters. Most of them make really nice, relevant, helpful websites and are not trying to look like something they are not. So, as search engine algos change very often, there is a continuous race between search engines and those who seek their favor - site owners, webmasters, SEOs and Web marketers.
So, while for most Internet users hitting "Search" is just a small thing, for search engines and Web marketers it is an ongoing battle in which both parties sometimes win and sometimes lose, and the victory is never final.
2. An Introduction to How Google 'Crawls' the Net and How It Indexes Your Website
There are various reasons why companies want their website to appear at the top of SERPs (search engine results pages). Moreover, there are different routes organizations can take to get there. A website's ranking in Google search results is partly based on an analysis of the sites that link back to it. The number, quality, and relevance of those links also count towards the rating. But the question is, how does Google collate this data?
Google uses something called a 'GoogleBot' to index or 'crawl' the whole web. Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index. They use thousands of machines to 'crawl' billions of pages on the web. Googlebot uses an algorithmic process: computer programs determine which sites to crawl, how often, and how many pages to fetch from each site. Googlebot's crawl process begins with a list of webpage URLs, generated from previous crawl processes and augmented with Sitemap data provided by webmasters. As Googlebot visits each of these websites it detects links (SRC and HREF) on each page and adds them to its list of pages to crawl. New sites, changes to existing sites, and dead links are noted and used to update the Google index.
For most sites, Googlebot shouldn't access your site more than once every few seconds on average. However, due to network delays, it's possible that the rate will appear to be slightly higher over short periods. In general, Googlebot should download only one copy of each page at a time. If you see that Googlebot is downloading a page multiple times, it's probably because the crawler was stopped and restarted. Googlebot was designed to be distributed over several machines to improve performance and scale as the web grows. Also, to cut down on bandwidth usage, Google runs many crawlers on machines located near the sites they're indexing in the network. Google's aim is to crawl as many pages from your site as it can on each visit without overwhelming your server's bandwidth.
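The crawl process described above - start from a list of URLs, fetch each page, pull out the SRC and HREF links, and queue anything new - is easy to picture in code. Below is a deliberately minimal sketch in Python, not Google's implementation: a single-machine crawl frontier with a politeness delay, using only the standard library.

```python
# A minimal, illustrative crawl loop (not Google's actual implementation).
# It keeps a frontier of URLs, fetches each page politely (with a delay),
# extracts SRC/HREF links, and adds newly discovered URLs to the frontier.
import time
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(urljoin(self.base_url, value))

def crawl(seed_urls, max_pages=50, delay_seconds=2.0):
    frontier = list(seed_urls)          # URLs waiting to be fetched
    seen = set(frontier)                # avoid re-queueing the same URL
    index = {}                          # url -> raw HTML (the "index")
    while frontier and len(index) < max_pages:
        url = frontier.pop(0)
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue                    # dead link: skip it (a real crawler records it)
        index[url] = html
        parser = LinkExtractor(url)
        parser.feed(html)
        for link in parser.links:
            if link not in seen:
                seen.add(link)
                frontier.append(link)
        time.sleep(delay_seconds)       # politeness: don't hammer any one server
    return index
```

A real crawler adds much more than this, of course: robots.txt handling, content-type filtering, per-host queues, duplicate detection, and scheduling how often each site is revisited.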
Google gives a great deal of guidance to help web developers improve their ranking positions. On their webmaster central page, Google states: 'The best way to get other sites to create relevant links to yours is to create unique, relevant content that can quickly gain popularity in the Internet community. The more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it. Before making any single decision, you should ask yourself the question: Is this going to be beneficial for my page's visitors? It is not only the number of links you have pointing to your site that matters, but also the quality and relevance of those links. Creating good content pays off: Links are usually editorial votes given by choice, and the buzzing blogger community can be an excellent place to generate interest.'
3. Keyword Quantifier - Are You Getting the Right Impressions?
So you've spent time doing all the normal things to check your potential competition and the rewards you'd get from top rankings in Google for your keywords, but you've noticed that the estimates you got don't line up with the reality of the traffic you end up getting - what's up with that?
For the last 10 years I've been testing different indicators for what to expect from a certain amount of traffic generation effort and to be honest with you - I'm still often surprised at the difference between what the tools and systems tell me and the traffic I see once I have those number 1 spots in Google.
Most people taking a quick look at their competition start with a search on Google and check the number of results. Most website owners realize that this includes a LOT of irrelevant sites, so they then check with "quotes" around their search term; this usually brings that number down a lot and is more realistic.
As time has gone on, Google in particular has been placing more importance on certain aspects of its results algorithms, focusing much more on local results and on how quickly pages load.
If your website takes too long to load you're going to get less traffic - it's a fact.
So, when you're doing your Google search what you see may not be as close to reality as you think. To most online marketers this is not new information and most people realize that (as I mentioned above) Google don't show you as much as some other systems. This is not necessarily because they're not seeing it - but because they don't want you to know about everything they see. They may tell you they only see 4 links to your site but they'll still rank it as though they see all 6000 (or however many you actually have). It's no surprise that you have to take their information with a pinch of salt.
People are always trying to 'game' the search engines and so they naturally respond by hiding what they're doing.
You may think you've got around some of this in the past by using the Allintitle: modifier in your searches (I told you to use this in my "knowing where to tap" ebook a few years ago) and it does help, but I want to make sure you understand that you're never going to have the true picture until you get that number one spot and see what your listing brings you. Even then, you can change that number by tweaking your page title and description tags to do normal copy conversion testing for your listing.
One of the things I spoke about last year was the apparent disparity in Google's results: it will tell you there are thousands of pages of results, but when you click 'next' until you reach the end, it often stops at around 200 and ends the results there. This was always one of those weird things people comment on - it leaves you wondering why Google told you there were so many when, as soon as you tried to see them all, most of them disappeared somewhere.
Weird....
Google's apparent reason was that they'd removed all but 'the most relevant' results for you.
If that's true and Google doesn't consider the other results meaningful it certainly explains why I have blogs that are number 1 for terms with 100,000,000 results - but that I've done no real work to market. The actual competition is obviously much lower in Google's reality.
Since Google are removing most of the results and only showing the top 1000, you're likely to only ever see a number between 1 and 1000 in those quoted searches. To make the search quicker you can just add them to your quoted search and you'll get Google's suggestion of the sites that are relevant - i.e. your real competition.
Give it a try for your niche and see what you get.
I can tell you that it's different to what most other methods will show you - and I still don't believe it's completely accurate but it does seem better than without doing the extra check.
The result of all this is that you then need to decide whether you can compete with the results that do show up.
I'd like to tell you that there's a logic to the number of results and how easy it is for you to compete, but I've found that it's not so simple - it all comes down to the niche, and your ability to get good links.
In some niches you'll see 400 results and the top ones are still all low page rank sites that you know you can beat; in other niches they'll be high page rank sites that you have almost no chance of competing with.
So you need to do some testing of your own to be able to tell what amount of work YOU are likely to need to do considering your abilities and previous results.
I like to have a good idea about what I can achieve before I start out, and luckily for me I've been doing this for over 10 years now, so I have a fair sense of what's possible - especially considering I have my own network of sites I can get links from right away and start making a difference from day one.
If you're just starting out it'll be a little different for you and you'll need to track your efforts and results in order to get a benchmark of where you can hit those juicy traffic spots.
If you're completely new you probably don't have much idea about this, so you're probably best off just doing what I've told you in the past: check out the top page of results, look at their page rank (use the SEOquake addon for Firefox to make this simple) or use a tool like Market Samurai, see how many links they have and what the page rank is, and use the search modifiers like I've said to see how optimized the existing results are. Get a feel for it, then combine that with what you know about your ability to get ranked in the past.
My favorite strategy for gauging what to focus on keyword-wise is to do some research and then gauge 'intention' by running some Adwords ads and seeing which keywords actually get traffic and turn into buyers.
I've said it before but I'll say it again - it often surprises me how different the phrases I imagine will convert best are from the keywords that actually turn into buyers.
It would be a shame to see you waste your time ranking top for a phrase you 'thought' would be a buying phrase and not focus on other phrases that in reality ARE buying phrases. So please, if you can, run some PPC campaigns and get that traffic to your site before you put a lot of time and energy into ranking for it, only to be disappointed because the traffic doesn't convert.
Unless you're just after brand awareness - you probably want the keywords and phrases that actually convert into buyers.
4. Keywords for Search Engines
Search engines are great tools and can drive a lot of traffic to your website, but you have to use them actively. For potential visitors to find your website, it must show up when they do a search. Create "signs" (keywords) that point the way to your website. Proper keywords must be selected for your site; by this I mean the actual content on your website should include the same words that people would be typing into a search engine. When the page contains the same words that people are searching for, it becomes more relevant.
Use the words people are looking for. Being more specific will mean your site is more relevant when people perform a search. Your keywords are the starting point of your whole marketing strategy; if they are chosen incorrectly, your target audience may never find you.
Finding the key terms about your product that people are looking for can be tricky. Take care with the terms that you use because, unlike you, your potential visitors don't know as much about the product and may use different terms when searching for it. Using online tools to find key terms relevant to your niche is important. The Word Tracker service and the Google AdWords and Yahoo keyword tools should all be used. These tools are based on actual searches performed by people looking for your product or service, so they are valid measures to use.
Each word in the keyword list must be evaluated for its potential effectiveness. A quick search for the keyword will show the number of results that appear, so you can assess its popularity; the services discussed above will also assist you with this. More popular search terms are more likely to be entered into the search engines, so you want to become relevant for those. It is especially important that you put yourself in the place of the customer and set up your website so that someone who doesn't know anything about you will be able to find you.
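One simple way to put this evaluation on paper is to score each candidate term by demand versus competition. The sketch below is only a rough screening heuristic, and every number in it is made up; in practice you would plug in the search counts from a keyword tool and the result counts from quoted Google searches.

```python
# A toy keyword screen: demand (estimated searches) divided by competition
# (competing results for the quoted phrase). All numbers are invented examples.
candidates = {
    # keyword: (estimated monthly searches, competing results)
    "widgets": (90500, 25_000_000),
    "blue widgets": (4400, 120_000),
    "buy blue widgets online": (320, 8_500),
}

def effectiveness(searches, competition):
    return searches / max(competition, 1)   # higher = more demand per competing page

for keyword, (searches, competition) in sorted(
    candidates.items(), key=lambda item: effectiveness(*item[1]), reverse=True
):
    print(f"{keyword:28s} score = {effectiveness(searches, competition):.5f}")
```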
5. Searching Strategies
Identification
Successful Internet searching begins even before you touch your keyboard or click on your mouse – it’s when you consider what type of search you want to carry out.
If you’re performing a very specific search - a company, product, or name of a person, then using a directory like Yahoo generally returns a small number of highly relevant pages.
When you’re looking for something more unusual, like a friend’s name, then you’ll want to perform the widest possible search. In this case, a meta search engine like metacrawler is useful, interrogating several search engines for you with a single query.
With the Internet numbering more than 100 billion pages, it can take a few weeks before anything new is indexed by the search engines. If you’re looking for information on a current topic, try searching the newsgroups instead.
For everything else use Google. It’s uncluttered with adverts, has a large database of pages, and returns very relevant results – everything you need in a great search engine.
Choose Keywords
But, even the best search engine is only as good as the keywords you enter; choose them wisely to get the best results. The best approach is to be very specific, right from the start – enter up to five keywords that relate to what you want to find. For example, say you live in Florida, USA and want to place a classified ad for an old computer you are selling: you've just upgraded your machine and want to get rid of the old hardware, which still works perfectly, and earn some cash back. Go to Google and type in the search box: Florida Computer Hardware Classifieds. You'll be presented with the most popular classifieds sites for advertising your particular hardware.
Sometimes you’ll get no pages returned, but that’s fine, you can always remove a word or two and search again. When choosing which keywords to use, try to think about the different ways in which people might present information. If you’re looking for reports on an error message, you need to allow for the possibility that anyone posting a fix for the problem will abbreviate the message. Enter only the most important parts.
Remember that most of the Internet is still American, and the spelling changes accordingly. Bear that in mind when you're searching.
Learn the Rules
Every search engine has its own syntax, but there are some common rules that work just about everywhere. For example, placing quotes around a group of words such as “Florida Computer Hardware Classifieds” ensures you'll only find pages that contain those words together, rather than “Florida” in one paragraph and “Hardware” somewhere else. You could add words to be more specific – “Florida Computer Hardware Classifieds” desktop – but some search engines may then return sites that contain just one of your search terms. Add a “+” before each one, like +“Florida” +“Computer” +“Hardware” +“Classifieds” +“desktop”, to tell the search engine that they are all required. One remaining problem with our search example is that we're using upper-case letters. This tells most search engines that we want a case-specific search, so searching all in lower case is usually the best policy.
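If you ever build search URLs programmatically (say, in a quick script or bookmarklet), the same rules apply. The short Python sketch below just assembles a query string following the conventions above; how each engine actually interprets quotes and the “+” prefix varies, so treat it as an illustration rather than a guarantee.

```python
# Assemble a search query using the conventions described above:
# quote the exact phrase, mark extra required terms with "+", keep it lower-case.
from urllib.parse import quote_plus

phrase = "florida computer hardware classifieds"
required_terms = ["desktop"]

query = f'"{phrase}" ' + " ".join(f"+{term}" for term in required_terms)
search_url = "http://www.google.com/search?q=" + quote_plus(query)

print(query)        # "florida computer hardware classifieds" +desktop
print(search_url)
```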
Advanced Strategies
Your final step to search engine mastery comes with a little lateral thinking about how the information you need is likely to be presented. Perhaps you want to learn HTML to build your own web page. Searching for ‘html’ in Google will return 250 million pages, which is less than useful. Try ‘beginners guide to html’, and that’s cut down to 500 thousand and you’re likely to find what you need in the first 20.
After some thought, you might decide to use an HTML editor instead. But which one is best? You could try searching ‘html editors’ but, as a phrase, that might be used in lots of irrelevant places. Much better to search for a few specific editor names, such as ‘dreamweaver’, ‘frontpage’, ‘webweaver’. You should immediately find a number of pages comparing the different products.
Your best bet is to experiment and see what happens. Improving your search engine technique is the simplest way to get more out of your Internet connection, and any time you invest learning new tips will be speedily repaid.
6. How to Defend your Website from the Google Duplicate Proxy Exploit
There is a current and active way to knock a website out of Google's search engine results. It's simple and effective. This information is already in the public domain and the more people that know about it, the more likelihood there is that Google will do something about it. This article will tell you how it works, how to get a website knocked out of the search engine rankings, but most importantly, how to defend your own website from having it happen to you.
To understand this exploit, you must first understand Google's Duplicate Content filter. Simply described: Google doesn't want you to search for "blue widget" and have the top 10 results be copies of the same article on how great blue widgets are. They want to give you ONE copy of the Great Blue Widget article, and 9 other different results, just on the off chance that you've already read that article and one of the other results is actually what you wanted.
To handle this, every time Google spiders and indexes a page, it checks to see if it already has a page that is predominantly the same - a duplicate page, if you will. Exactly how Google works this out, nobody knows, but it is going to be a combination of some or all of: page text length, page title, headings, keyword densities, checking for exact-copy sentence fragments, etc. As a result of this duplicate content filter, a whole industry has grown up around trying to get around the filter; just search for "spin article".
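To give a feel for how a duplicate check of this kind can work, here is a toy version based on overlapping word sequences ("shingles") and a similarity ratio. It illustrates the general idea only; Google's actual signals and thresholds are not public, and the sample texts below are invented.

```python
# Toy near-duplicate detection: split each page's text into overlapping
# 5-word "shingles" and compare the two sets with a similarity ratio.
def shingles(text, size=5):
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(max(len(words) - size + 1, 1))}

def similarity(text_a, text_b):
    a, b = shingles(text_a), shingles(text_b)
    return len(a & b) / len(a | b)   # shared shingles divided by all distinct shingles

original = ("Blue widgets are great because they are cheap, sturdy and easy to install. "
            "This guide explains how to choose the right blue widget for your home and where to buy one.")
proxy_copy = original + " Hosted copy served by a caching proxy."

# The two texts share most of their shingles, so the score comes out high
# (around 0.8 here); a real filter would apply its own threshold.
print(round(similarity(original, proxy_copy), 2))
```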
Getting back to the story: Google indexes a page and, let's say, it fails its duplicate content check - what does Google do? These days, it dumps that duplicate page in Google's Supplemental Index. What, you didn't know that Google have two indexes? Well they do: the main one, and a supplemental one. Two things are important here: Google will always return results from their main index if they can, and they will only go to the supplemental index if they don't get enough joy from the main one. What this means is that if your page is in the supplemental index, it's almost certain you will never show up in the Search Engine Ranking Pages, unless there is next to no competition for the phrase that was searched for.
This all seems pretty reasonable to me, so what's the problem? Well, there's another little step I haven't mentioned yet. What happens if someone copies your page - let's say the homepage of your business website - and when Google indexes that copy, it correctly determines that it's a duplicate? Now Google knows about two pages that are duplicates of each other, and it has to decide which to dump in the supplemental index and which to keep in the main one. That's pretty obvious, right? But how does Google know which is the original and which is the copy? They don't. Sure, they have some clever algorithms to work it out, but even if those are 99% accurate, that leaves a lot of problems for the 1% of the time they get it wrong!
And this is the heart of the exploit: if someone copies your website's homepage, say, and manages to convince Google that *their* page is the original, your homepage will get tossed into the supplemental index, never to see the light of day in the Search Engine Ranking Pages again. In case I'm not being clear enough, that's bad! But wait, it gets worse:
It's fair to say that when a person physically copies your page and hosts it, you can often get them to take it down through the use of copyright lawyers, cease-and-desist letters to ISPs and the like, plus a quick "Reinclusion Request" to Google. But recently there's a new threat that's a whole lot harder to stop: the use of publicly accessible proxy websites. (If you don't know what a proxy is, it's basically a way of making the web run faster by caching content closer to its destination. In principle they are generally a good thing.)
There are many such web proxies out there, and I won't list any here, but I will describe the process: they send out spiders (much like Google's) that spider your page and take your content, then they host a copy of your website on their proxy site, nominally so that when their users request your page they can serve up the local copy quickly rather than having to retrieve it from your server. The big issue is that Google can sometimes decide that the proxy copy of your web page is the original, and yours is not.
Worse again, there's some evidence that people are deliberately and maliciously using proxy servers to cache copies of web pages, then using normal (white and black hat) Search Engine Optimization (SEO) techniques to make those proxy pages rank in the search engine, increasing the likelihood that your legitimate page will be the one dumped by the search engines' duplicate content filters. Danger Will Robinson!
Even worse still, some of the proxy spiders actively spoof their origins so that you don't realise that it's a spider from a proxy, as they pretend to be a Googlebot for example, or from Yahoo. This is why the major search engines actively publish guidelines on how to identify and validate their own spiders.
Now for the big question: how can you defend against this? There are several possible solutions, depending on your web hosting technology and technical competence:
Option 1 - If you are running Apache and PHP on your server, you can set the web host up to check for search engine spiders that purport to be from the main search engines and, using PHP and the .htaccess file, block proxies from other sources. However, this only works for proxies that are playing by the rules and identifying themselves correctly.
Option 2 - If you are using MS Windows and IIS on your server, or if you are on a shared hosting solution that doesn't give you the ability to do anything clever, it's an awful lot harder and you should take the advice of a professional on how to defend yourself from this kind of attack.
Option 3 - This is currently the best solution available, and applies if you are running a PHP or ASP based website: you set ALL pages' robots meta tags to noindex and nofollow, then you implement a PHP or ASP script on each page that checks for valid spiders from the major search engines and, if it finds one, resets the robots meta tags to index and follow. The important distinction here is that it's easier to validate a real spider, and to discount a spider that's trying to spoof you, because the major search engines publish processes and procedures for doing this, including IP lookups and the like.
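The article describes these scripts in PHP or ASP; the sketch below shows the same validation idea in Python so you can see the shape of it. It assumes your framework gives you the visitor's IP address, and it uses the two-step check the major engines document: reverse-resolve the IP, check that the host name ends in the engine's domain, then forward-resolve that host name and confirm it maps back to the same IP. The trusted suffixes shown are examples and should be confirmed against each engine's own documentation.

```python
# Verify a claimed search engine spider by reverse DNS plus forward confirmation,
# then decide which robots meta tag to emit. The trusted suffixes below are
# examples; confirm the current ones in each engine's documentation.
import socket

TRUSTED_SUFFIXES = (".googlebot.com", ".google.com", ".search.msn.com")

def is_verified_spider(ip_address):
    try:
        host, _aliases, _ips = socket.gethostbyaddr(ip_address)   # reverse DNS
    except OSError:
        return False
    if not host.endswith(TRUSTED_SUFFIXES):
        return False
    try:
        forward_ips = socket.gethostbyname_ex(host)[2]            # forward confirmation
    except OSError:
        return False
    return ip_address in forward_ips

def robots_meta_tag(ip_address):
    # Default to noindex,nofollow; only a verified spider gets index,follow.
    content = "index,follow" if is_verified_spider(ip_address) else "noindex,nofollow"
    return f'<meta name="robots" content="{content}">'

# Example: emit the tag for whatever address your web framework reports.
print(robots_meta_tag("66.249.66.1"))
```

The effect is that proxy copies are served pages marked noindex,nofollow by default, while verified engine spiders visiting your real site still see index,follow.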
So, stay aware, stay knowledgeable, and stay protected. And if you see that you've suddenly been dumped from the Search Engine Rankings Pages, now you might know why, how and what to do about it.
7. 10 Most Valuable Free Google Marketing Tools
Google has become the dominant search engine on the Internet. It would be hard to imagine the web without Google. For that matter, it would now be hard to imagine a world without Google. As frightening as that may seem to many people, it is nonetheless true.
For better or worse, Google has permeated almost every aspect of our everyday life. Being Googled is now a common expression and an act carried out by millions of users around the world each day. New Google products and services are coming on stream at a frightening pace, further increasing Google's impact on our lives.
Despite this dominating presence, many people still don't realize that Google offers some excellent free marketing tools for marketers and webmasters - tools that can prove extremely valuable to anyone trying to promote their sites or products online, and that will make your promotions easier and much more profitable.
Don't be fooled by the 'free' label: these marketing tools might be free, but they are also valuable. One even wonders why Google would give these tools and services away for free. It probably makes good business sense in the long run: by providing these free tools, Google is fostering a lot of company goodwill and building up the Google brand name in the process. Good PR is good business.
Every marketer and webmaster should be taking advantage of Google's good-will and snapping up these professionally run services and marketing tools. Here's a quick run-down of the 10 most valuable free Google Internet marketing tools:
1. Google Analytics
Perhaps the premier marketing tool offered by Google. It will prove helpful to both the marketer and the webmaster. Google Analytics gives you a daily snapshot of your web site. Google Analytics analyzes your traffic, where it comes from and what it does once it enters your site. You can monitor up to three sites for free.
Google Analytics is extremely valuable for analyzing your marketing funnel: it tracks all the steps leading up to your sales or checkout page - vital information for raising your conversion rate and ROI.
You may be placed on a waiting list for this highly in demand service from Google.
LINK: http://www.google.com/analytics/
2. Google Sitemaps
Webmasters can use Google Sitemaps to almost instantly place newly created pages on their site into the Google search index. A Sitemap is an XML file that is uploaded to Google as new pages are added to your site. Needless to say, this can be a valuable service for any webmaster or marketer who wants to get their information onto the web quickly.
LINK: https://www.google.com/webmasters/sitemaps/docs/en/about.html
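The Sitemap file itself is plain XML and easy to generate. Here is a minimal sketch that writes one for a couple of hypothetical URLs, following the public sitemaps.org format that Google Sitemaps accepts; real sitemaps often add optional fields such as last-modified dates.

```python
# Write a minimal sitemap.xml for a list of URLs (example URLs only).
from xml.sax.saxutils import escape

def build_sitemap(urls):
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n"
            "</urlset>\n")

pages = ["http://www.example.com/", "http://www.example.com/new-article.html"]
with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(build_sitemap(pages))
```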
3. Google Alerts
Be notified when someone or another site mentions your site or your name. It's a great way to keep track of all your online activities and to monitor all your online business interests and products.
LINK: http://www.google.com/alerts
4. Google Froogle
Froogle is Google's price directory! It simply lists all the cheapest prices for different products on the web. For marketers and webmasters who are promoting products, it should be studied and analyzed. Optimizing your site's content for Froogle may prove to be very beneficial.
Follow Froogle or Google directions exactly on how to list or display products on your site. Froogle will spider your site and display your prices and products to thousands of targeted customers. That, as they say, is priceless.
LINK: http://froogle.google.com/
5. Google Checkout
Not exactly free but for those marketers who use AdWords - for every $1 spent on AdWords you can process $10 for free. You can also place the shopping cart logo on your AdWords ad and take advantage of the prestige and trust the Google brand name has built up.
Over time marketers may find this tool to be very effective and valuable.
LINK: https://checkout.google.com/
6. Google Blogger
Blogging has become vitally important to the health and functioning of your web site. No site should be without at least one blog and RSS feed. Creating a blog (online journal) on the topic of your web site or product will bring in extra traffic and targeted customers. Blogger is a simple, free blogging service that even lets you publish or post your blog files to your own web server. Keep in mind, each blog has that all-important Google Blog Search bar.
LINK: http://www.blogger.com/
7. Google Toolbar - Enterprise Version
Try the new enterprise version of the Google Toolbar for your company or business. It integrates countless features across all your employees or your corporate network. These could include a common customer database, company calendar, financial news...
Keep in mind, Google also ranks every page it indexes on a scale of 0-10. While it is important to know the Page Rank of your own pages, it is even more important to know the PR of your competitor's pages. You can use the toolbar to get the PR of each page you're visiting. Extremely helpful information for webmasters and marketers to know when forming online linking or business arrangements.
LINK: http://toolbar.google.com/T4/enterprise/
8. Google Groups
Every marketer knows the importance of having a large contact list of people with a similar interest. Social networking will play an ever-increasing role in your success on the web. Just look at the growing popularity of sites like MySpace and LiveJournal.
Google Groups is another form of social and business networking that every marketer should be aware of and pursuing.
LINK: http://groups.google.com/
9. Google Adsense
One simple way to monetize your web content is to use Google Adsense. Just place the Adsense code on your site and receive a check from Google each month. For webmasters who are not really into online marketing (do such creatures exist?), Adsense can be a painless way to earn extra income from your site.
For professional marketers, using the Adsense system can supply a tremendous amount of marketing information on the keywords in their particular niche. It keeps the marketer informed on what keywords are being bid on and how much advertisers are willing to pay.
Adsense also has an excellent real-time tracking system you can use to keep track of all your important web pages.
LINK: https://www.google.com/adsense/
10. Google Writely
A recent addition to Google's stable of free products, Writely is a full-featured online writing editor with spellcheck and great collaboration features. It also lets you publish your content directly to your blogs. One feature that may be of interest to marketers: it lets you save files in the popular PDF download format.
Let's face it, until video takes over the web in four or five years' time, the written word is still king on the net. It is the medium that markets, promotes and sells your content or products. Writely will help you write better.
LINK: http://www.writely.com
Honorable Mention - Google Trends
This Google program will let you search popular trends, important for marketers searching for the latest hot product to promote. You can also break down these trends by different regions.
LINK: http://www.google.com/trends
Final Note
Please take note that signing up for a Google account will usually help you obtain most of these free services or programs. Some of these programs may have to be applied for individually. But be assured, all these free Google marketing tools are well worth your time and effort. They will make your marketing easier, and they will help any webmaster or marketer run their online business more efficiently.
Source: articlesphere