Internet is not a business plan
•Simply having a website or online business is not a business model in itself.
•The Internet is simply a medium through which you can market, sell, and communicate with your customers.
•Online marketing is still hard work and takes time – a minimum of three months.
What is Online Marketing?
Online marketing combines traditional marketing with Internet technology to improve sales, generate leads, and more.
Having a website alone is not considered online marketing.
Your website should serve as a virtual office that provides information, collects leads, and sells products.
The web offers incredible potential for marketing, sales, customer feedback, and quantifiable results.
Simply having a website and doing pay-per-click is not enough.
Search Engine Optimization (SEO)
Optimize web pages for better indexing in search engines to improve ranking of web pages and prominence in search results
Increasing the website's Link Popularity and Web Ranking through Search Engine Submissions, Directory Submissions, and Link Building Campaigns.
Create search engine prominence by making the website a trusted authority on relevant subjects by submitting Press Releases, Articles, and other Marketing Materials.
Paid Online Advertising (PPC)
Generating immediate traffic by establishing an effective and targeted Pay-Per-Click campaign.
Ethical Search Engine Marketing
SEO companies – There are many SEO companies that will guarantee results – I feel this is unethical and deceitful.
Nobody controls the Internet or the search engines.
Be wary of anyone who guarantees results
Organic traffic, as the name implies, is traffic that comes to your Web site naturally and without being driven there by a specific marketing campaign. In essence, Web site visitors are there because they found the site and thought it had something they wanted. And like anything organic, organic traffic isn’t there instantly; it takes time and nurturing to grow into something healthy and with longevity.
Organic traffic happens in the same way that you might browse the bookshelf at your local library or bookstore for something in a specific area of interest and find that little treasure that contains all the answers you’re seeking. This is, more or less, what the Web was all about when it was first created. Sources of organic traffic include:
referrals from other Web sites (links)
referrals from search engines, and
URLs placed on letterhead, business cards, etc.
Paid Web Site Marketing
Paid Web site marketing has the advantage of driving traffic immediately to your Web site. This is great for launching a site, or for a special promotion. Popular paid options include (but are not limited to):
Newspaper, magazine, and TV ads
Purchasing of banner ads on other Web sites
Launching a Search Engine Marketing (SEM) campaign, and
Distribution of mass emails and press releases
A secondary benefit of paid Web site marketing is that when done properly, it can help lay the seeds of organically generated traffic.
Short Term and Long Term Goals
Organic Search Engine traffic takes time. Time is one of the most important factors in the ranking of a website. Older domains have more weight than newer domains.
PPC creates instant traffic… at a price.
Online Marketing Strategy
Rather than employing just one technique over the other, an online marketing strategy should include both SEO and PPC, as well as traditional marketing, i.e. print ads, TV/radio promotion, mailers, and networking.
About Search Engines
Spider “crawls” the web to find new documents (web pages, other documents) typically by following hyperlinks from websites already in the database
Search engine indexes the content (text, code) in these documents by adding it to their huge databases and periodically updates this content
Search engine searches its own database when user enters in a search to find related documents (not searching web pages in real-time)
Search engine ranks resulting documents using an algorithm (mathematical formula) that assigns various weights to various ranking factors
What are Spiders/Bots?
Small software programs that can scour millions of web pages per second and retrieve links to relevant information based on your query.
Design for Robots and Humans?
The first rule to remember is that robots cannot read graphic images, and they cannot read Flash animation. Keywords need to be scripted, written, and strategically placed on the code side of sites that use Flash and graphic-only home pages or “landing pages.”
META tags, Dublin Core (DC) tags, and other such directing tags in the code all help, and all are necessary.
What criteria are evaluated when we consider “what a robot sees”? The robot will look at the density of keyword phrases (KWPs) in the home page text, and at the interior pages and their URLs. The robot will check where the navigation bar links take us, whether its expectations are met, and whether the site has credibility (meta information, links).
In designing a web site, save the Flash animation until after you have your visitors’ attention and they have made a commitment to you and your site.
How Search Engines Operate
Search engines have a short list of critical operations that allows them to provide relevant web results when searchers use their system to find information.
Crawling the Web
Search engines run automated programs, called “bots” or “spiders” that use the hyperlink structure of the web to “crawl” the pages and documents that make up the World Wide Web. Estimates are that of the approximately 20 billion existing pages, search engines have crawled between 8 and 10 billion.
Once a page has been crawled, its contents can be “indexed” – stored in a giant database of documents that makes up a search engine’s “index”. This index needs to be tightly managed, so that requests which must search and sort billions of documents can be completed in fractions of a second.
When a request for information comes into the search engine (hundreds of millions do each day), the engine retrieves from its index all the documents that match the query. A match is determined if the terms or phrase are found on the page in the manner specified by the user.
Once the search engine has determined which results are a match for the query, the engine’s algorithm (a mathematical equation commonly used for sorting) runs calculations on each of the results to determine which is most relevant to the given query. They sort these on the results pages in order from most relevant to least so that users can make a choice about which to select.
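The index–retrieve–rank sequence described above can be sketched in a few lines. This is a toy model only: the sample pages and the ranking rule (counting matched query terms) are illustrative assumptions, not how any real search engine scores documents.

```python
# Toy sketch of the index -> retrieve -> rank pipeline described above.
from collections import defaultdict

documents = {
    "page1": "online marketing combines traditional marketing with the web",
    "page2": "search engines crawl and index web pages",
    "page3": "marketing your web site takes time and effort",
}

# Index: map each term to the set of documents containing it.
index = defaultdict(set)
for doc_id, text in documents.items():
    for term in text.lower().split():
        index[term].add(doc_id)

def search(query):
    """Retrieve matching documents, ranked by how many query terms each contains."""
    scores = defaultdict(int)
    for term in query.lower().split():
        for doc_id in index.get(term, ()):
            scores[doc_id] += 1
    return sorted(scores, key=scores.get, reverse=True)

print(search("web marketing"))
```

A real engine does the same three steps, but over billions of documents and with an algorithm that weighs many ranking factors rather than a simple term count.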
Title Tags, Meta Tags, and Keywords
Search engines surface websites based on keywords, so identify the keywords your visitors would use to find your product or service. Create a list of keywords for which you would like your website to appear in search engine results.
Optimize your website by adding your keywords to your meta tags, links, and content.
We typically recommend only going after three or four related keywords per page (five if you can balance them properly). Any more than that and you run the risk of diluting your page to the point where you rank for nothing. Make sure to naturally work the keywords into your content and avoid over-repetition that may be interpreted as spamming. Your content should never sound forced.
Your on-page content isn’t the only place where you can insert keywords. Keywords should also be used in several other elements on your site:
Meta Description Tags
Meta Keywords Tag
Anchor Text/ Navigational Links
You’ve spent a lot of time molding your keywords; make sure you use them in all the appropriate fields to get the maximum benefit.
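As a concrete illustration of placing keywords in those fields, the snippet below assembles a page `<head>` with the title, meta description, and meta keywords elements listed above. The business, keyword list, and page text are hypothetical examples.

```python
# Sketch: placing chosen keywords into the title and meta elements.
# The keywords and page details below are made-up examples.
keywords = ["organic coffee", "fair trade beans", "coffee roaster"]

title = "Organic Coffee & Fair Trade Beans | Example Roaster"
description = ("Small-batch coffee roaster offering organic coffee "
               "and fair trade beans, shipped fresh.")

head = f"""<head>
  <title>{title}</title>
  <meta name="description" content="{description}">
  <meta name="keywords" content="{', '.join(keywords)}">
</head>"""
print(head)
```

The same phrases would also be worked naturally into the body text and anchor text of navigational links.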
Keyword Density and Relevance
Keyword density is the number of times a keyword or phrase appears on a page, expressed as a percentage of the total number of words on that page. In the context of search engine optimization, keyword density can be used as a factor in determining whether a web page is relevant to a specified keyword or keyword phrase. Because keyword density is so easy to manipulate, search engines usually implement other measures of relevancy to prevent unscrupulous webmasters from creating search spam through practices such as keyword stuffing.
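The density calculation itself is simple arithmetic, as this sketch shows. The sample page text is a made-up example, and, as noted above, real engines rely on far more than this single measure.

```python
# Keyword density as defined above: occurrences of the phrase, times the
# words in the phrase, divided by total words on the page, as a percentage.
def keyword_density(text, phrase):
    words = text.lower().split()
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count each position where the full phrase occurs.
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return 100.0 * hits * n / len(words)

page = "organic coffee from our roastery organic coffee is delivered fresh"
print(keyword_density(page, "organic coffee"))  # 4 of 10 words -> 40.0
```

A density that high would likely read as keyword stuffing; the point of the sketch is the measurement, not the target value.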
Link Text – Outbound Links
Content is King
On-Page Factors (Code & Content)
Title Tags – The title element of a page is meant to be an accurate, concise description of a page’s content. It creates value in three specific areas and is critical to both user experience and search engine optimization.
ALT image tags
Content, Content, Content (Body text)
Keyword frequency & density
What is Link Popularity?
Every major search engine uses link popularity as a part of its ranking algorithm to some extent, so an effective link popularity strategy plays a critical role in improving your website rankings. Link popularity is becoming more widely used by search engines, making it a very important consideration when trying to improve your ranks. Hypothetically, if two sites have equal content and equal meta tags, the site with more popularity will rank higher.
The link popularity of your site is determined by:
The number of web sites linking to your site (the web page on which your link appears must be indexed by the search engine – “free-for-all” link pages are not the kind of links you’re looking for),
The popularity of the sites that are linking to your site, and
The similarity of the content on the sites that link to your site.
Link popularity is used by every search engine to some extent; ways to increase it are discussed below.
Link popularity measures the quantity and quality of sites that link back to your website. For example, Google uses its ranking algorithm, called “PageRank”, to determine the relevancy of each website. Naturally, the higher the relevancy of the website, the better its listing in Google’s results.
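A PageRank-style score can be sketched with a few lines of power iteration, following Google's published description of PageRank (each link acts as a vote, weighted by the voter's own score). The three-page link graph and damping factor below are illustrative assumptions, not Google's actual data or implementation.

```python
# Toy PageRank-style link popularity sketch over a made-up link graph.
damping = 0.85
links = {              # page -> pages it links out to
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):  # iterate until the scores settle
    new_rank = {}
    for p in pages:
        # Each inbound link contributes the linker's rank, split among
        # all of that linker's outbound links.
        inbound = sum(rank[q] / len(links[q])
                      for q in pages if p in links[q])
        new_rank[p] = (1 - damping) / len(pages) + damping * inbound
    rank = new_rank

print(sorted(rank, key=rank.get, reverse=True))  # most "popular" first
```

Note that page "c" ends up on top: it receives links from both other pages, which is exactly the "quantity and quality of inbound links" idea described above.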
Search Engine/Directory Submissions
Increase link popularity by submitting to search engines and directories.
Sitemaps – Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.
Web crawlers usually discover pages from links within the site and from other sites. Sitemaps supplement this data to allow crawlers that support Sitemaps to pick up all URLs in the Sitemap and learn about those URLs using the associated metadata. Using the Sitemap protocol does not guarantee that web pages are included in search engines, but provides hints for web crawlers to do a better job of crawling your site.
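In its simplest form, a Sitemap file like the one described above can be generated mechanically. The field names follow the Sitemap protocol (`loc`, `changefreq`, `priority`); the URLs and values are hypothetical examples.

```python
# Sketch: generating a minimal Sitemap XML file for a hypothetical site.
urls = [
    {"loc": "https://www.example.com/", "changefreq": "daily", "priority": "1.0"},
    {"loc": "https://www.example.com/products", "changefreq": "weekly", "priority": "0.8"},
]

entries = "\n".join(
    "  <url>\n"
    f"    <loc>{u['loc']}</loc>\n"
    f"    <changefreq>{u['changefreq']}</changefreq>\n"
    f"    <priority>{u['priority']}</priority>\n"
    "  </url>"
    for u in urls
)
sitemap = ('<?xml version="1.0" encoding="UTF-8"?>\n'
           '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
           f"{entries}\n</urlset>")
print(sitemap)
```

The resulting file is typically saved at the site root (e.g. `/sitemap.xml`) and submitted to the search engines' webmaster tools.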
An inbound link is a hyperlink transiting domains. Links are inbound from the perspective of the link target, and conversely, outbound from the perspective of the originator. Inbound links were originally important (prior to the emergence of search engines) as a primary means of web navigation; today their significance lies in search engine optimization (SEO).
In addition to rankings by content, many search engines rank pages based on inbound links. Google’s description of their PageRank system, for instance, notes that Google interprets a link from page A to page B as a vote, by page A, for page B. Knowledge of this form of search engine rankings has fueled a portion of the SEO industry commonly termed linkspam, where a company attempts to place as many inbound links as possible to their site regardless of the context of the originating site.
Increasingly, inbound links are being weighed against link popularity and originating context. This transition is reducing the notion of one link, one vote in SEO, a trend proponents hope will help curb linkspam as a whole.
Also – a reciprocal link is a mutual link between two objects, commonly between two websites, to ensure mutual traffic.
Website owners often submit their sites to reciprocal link exchange directories, in order to achieve higher rankings in the search engines. Reciprocal linking between websites is an important part of the search engine optimization process because Google uses link popularity algorithms (defined as the number of links that led to a particular page and the anchor text of the link) to rank websites for relevancy.
Identify Keywords that visitors would use to find your website.
Popular Keywords VS Competitive
What relevant keyword phrases are used to find your website?
How many times is that keyword phrase queried?
Popular keywords are usually also competitive keywords.
Try to identify keyword phrases that are popular and less competitive.
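The popular-but-less-competitive trade-off can be framed as a simple score per candidate phrase. The search and competition numbers and the scoring formula below are illustrative assumptions, not data from any keyword tool.

```python
# Sketch: weighing keyword popularity against competitiveness.
candidates = {
    # phrase: (monthly searches, competing pages indexed) - made-up numbers
    "coffee":                (1_000_000, 500_000_000),
    "organic coffee":        (   40_000,   2_000_000),
    "organic coffee boston": (    1_500,      40_000),
}

def opportunity(searches, competition):
    # Popular but less competitive phrases score highest.
    return searches / competition

best = max(candidates, key=lambda k: opportunity(*candidates[k]))
print(best)
```

In this made-up example the narrow, targeted phrase wins even though the broad term has far more searches, which is the point of the advice above.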
Analyze your Competition
Who appears on the first page for your keyword phrase?
What meta tags are they using?
What keywords appear on the page?
Who is linking to them? (Google search: link:domainname.com)
Content Is King!
Having unique, fresh content will generate traffic naturally. Writing compelling and interesting articles or stories will attract both visitors and bots.
Keywords should be relevant – Consider your customer – Don’t use too much industry jargon
Use a Keyword tool
Broad and targeted keywords. You’ll need both to rank well.
Broad terms are important because they describe what your web site does; BUT they won’t increase qualified traffic coming into your site.
Popular VS Competitive
Keywords are usually bid on. More competitive keywords usually require higher bids in order to receive better placement. Better placement also depends on the effectiveness of your ad and the relevance of the keywords to the user’s search.
Google’s AdWords offers a great tool that will help you identify new keywords and see the average clicks per day.
Here are two factors for calculating how much you should bid on a keyword for a particular item. For a basic, per-sale return-on-investment (ROI) calculation, you have to determine how much each product you sell is worth:
The online conversion rate from web site visitor to customer. How many of the people visiting your site through a link in your ad actually make a purchase?
The average overall dollar value of each customer. How much do those customers spend? Is it enough to cover your advertising expenses?
These factors can depend heavily on your company’s products and services. For example, if you have very consumer-oriented, competitive products, the conversion rate may be quite high while the per-sale profit may be relatively low. On the other hand, if you offer a higher-priced, specialized business service, your conversion rate may be low, but the profit-per-conversion could be high.
Analyzing that data will help you decide how much to spend to attract the visitors that are most likely to turn into customers. If you have relatively low conversion rates, but high average dollar value, you may still want to bid high enough to achieve placement at the top of the search engine results because keeping the traffic level up has long-term value.
On the other hand, if your conversion rate is high — usually because the site sells products or services in high demand — but the dollar value is relatively low, it may not make sense to spend the money it would require to be in the top position.
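The per-sale ROI arithmetic described above reduces to a couple of multiplications. All the figures below (conversion rate, profit per sale, the sample bid) are hypothetical.

```python
# Sketch of the per-sale ROI calculation: what is one ad click worth?
conversion_rate = 0.02   # 2% of ad clicks become customers (made-up)
profit_per_sale = 50.00  # average profit per sale, in dollars (made-up)

# Expected profit from a single click = conversion rate x profit per sale.
value_per_click = conversion_rate * profit_per_sale
print(value_per_click)  # 1.0 -> a $1.00 bid is the break-even point

# Bidding below break-even keeps the campaign profitable; e.g. a $0.60
# bid leaves $0.40 of expected margin per click.
margin_at_60_cents = value_per_click - 0.60
print(round(margin_at_60_cents, 2))
```

As the text notes, you may still choose to bid above or below this break-even point depending on the long-term value you place on traffic and top placement.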
Set Goals and Budget
You should identify what your goals are for your PPC campaign. How much traffic do you want? How much traffic can you afford? Again, Google AdWords offers some great tools to estimate traffic based on budget.
You can maximize your budget by writing compelling ads, choosing the right keywords and targeting the right market.
Write Compelling Targeted Ads
•Seems logical, right?
•Increase your qualified clicks by targeting your ad copy. (Google’s position algorithm takes into account relevancy as well as maximum bid, so if your copy entices clicks, you’ll climb the page while controlling expense.)
•The person searching who sees your advertising copy should know exactly what they can expect to see when they click on your ad: hopefully, they should reach the exact product they’re seeking.
•Relevance is the critical concept here: getting the copy (and the landing page) right can increase the economic efficiency of your campaigns by 10% or more.
Think Small and Think Hard
Advertisers familiar with the search engines’ character limits don’t have the luxury of writing long. The opportunity is to turn these limits to your advantage. Use them to hone your ability to craft copy that cuts to the chase.
If you’re responsible for your company’s advertising, can you describe why a prospect should buy from you – in three compelling phrases? Can your search provider explain what makes them special – in seventy characters or less?
“Listen” and Repeat – ECHO
We’re all familiar with the idea that good communicators actively listen and then reflect back what they heard. The same principle applies when crafting advertising copy. Wherever relevant, “echo” the actual search phrase in your copy. Google rewards this “echoed” text with bolding, so to get the most out of this benefit, echo the key terms in the title.
Geo-Targeting
Geo-targeting is the method of determining the physical location of a website visitor and delivering different content to that visitor based on his or her location, such as country, region/state, city, metro code/ZIP code, organization, or ISP.
Geo-targeting lets you target your ads to specific countries and languages. When you create a new AdWords campaign, you select the countries or regions and the language(s) for your ad. That campaign’s ads will appear only to users who live in those areas and who have selected one of those languages as their preference.
Conversion Tracking
In online advertising, a conversion occurs when a click on your ad leads directly to user behavior you deem valuable, such as a purchase, signup, page view, or lead. Google has developed a tool to measure these conversions and, ultimately, help you identify how effective your AdWords ads and keywords are for you.
It works by placing a cookie on a user’s computer or mobile phone when he or she clicks on one of your AdWords ads. Then, if the user reaches one of your conversion pages, the cookie is connected to your web page. When a match is made, Google records a successful conversion for you. Please note that the cookie Google adds to a user’s computer or mobile device when he or she clicks on an ad expires in approximately 30 days. This measure, and the fact that Google uses separate servers for conversion tracking and search results, protects the user’s privacy.
A / B Testing
A/B testing, or “split-run testing” as it is sometimes called, is one of the most pervasive and widely used methodologies behind web site improvement. And rightfully so – the concept is simple.
Say, for example, you want to test a new version of your Home Page. Well, direct some of your traffic to your current page (the A page) and some to your new page (the B page)…look at the differences in performance…and voila, you have your winner! Did the new page B outperform page A? If so, great – let’s direct all of our traffic to that page and watch our sales go through the roof. Or perhaps page B didn’t fare so well – so scrap it, and try again.
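Before declaring a winner in the A vs. B comparison above, it is worth checking that the difference is statistically significant rather than noise. The sketch below uses a standard two-proportion z-test; the visitor and conversion counts are made-up examples, and the z-test is one common choice among several valid significance tests.

```python
# Sketch: judging an A/B test result with a two-proportion z-test.
import math

def z_score(conv_a, n_a, conv_b, n_b):
    """How many standard errors separate B's conversion rate from A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Page A: 200 sales from 10,000 visitors; page B: 260 from 10,000 (made-up).
z = z_score(200, 10_000, 260, 10_000)
print(round(z, 2))  # |z| > 1.96 -> significant at roughly the 95% level
```

Only when the difference clears a significance threshold like this does it make sense to "direct all of our traffic" to the winning page; otherwise, keep the test running.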