Link Pyramid

What Is a Link Pyramid?

A link pyramid is a powerful and effective way of building backlinks to your site.
Instead of pointing a single link at your site (or thousands of individual links, which looks incredibly spammy), you build your links in tiers. This not only makes your linking look more natural, it also makes it a whole lot more powerful.

How Does It Work?



A link pyramid is easiest to picture as a simple tiered diagram.
As the tiers below show, the number of tier 1 links pointing to your main site is kept very small. This is ideal because Google penalties target large volumes of links built directly to your site; by following the link pyramid strategy you avoid that penalty.

Tier 2 contains more links that point to tier 1, making the tier 1 links more powerful through the link juice they gain from tier 2. Try to keep the anchor text relevant to your money site's niche.
Tier 3 links to tier 2 and is usually considered a fairly spammy layer. It often consists of blasting thousands of profile links, social media profile links, classified submissions, social shares and other spammy links at the tier 2 properties.

Tier 1
10 blog websites linking to the main site with a main keyword and a secondary keyword.

Tier 2
5 other blog websites, each containing 2 links: 1 linking to the main website and 1 linking to one of the tier 1 links.

Why Should You Choose a Link Pyramid?

Link pyramids have always been, and still are, one of the most effective ways to do SEO and ORM. We keep track of all the tier 1 links so that, if Google changes its rules and your link pyramid becomes bad for your SEO, you can simply change or remove the tier 1 links and keep your money sites safe. That's why we choose the link pyramid to promote our websites.

Digital and Social Media Strategies: Driving Organizational Performance


In the past, developing a digital strategy and an online relationship with customers was a competitive advantage; now it is a basic requirement for doing business. This programme will give you the practical guidance you need to transition from brick and mortar to click and mortar through powerful online engagement. The programme shows how to choose the best tools for your needs and develop a strategy oriented to your business goals. The discussion during the programme will help you develop a social media process that is quantifiable, repeatable and improvable. New digital and social media platforms will keep emerging, and any strategy based on the concepts and methods in the programme can be adapted to take full advantage of them. The resulting digital and social media strategy will be testable, controllable and fully integrated with the broader goals and objectives of the company.

Contents
  • Evaluating how companies can satisfy customer needs and lower customer acquisition and retention costs
  • Developing a transition plan from offline to online engagement with customers
  • Formulating an action plan for successful digital and social media policy
  • Allocating resources across different channels
  • Evaluating mobile platforms and their impact on marketing in the future
  • Organizational processes required to implement digital and social media strategies


Who Should Attend
Business leaders and managers with P & L responsibility, CMOs, product managers, marketing managers, online marketing and software development executives, consultants, digital and social media analysts.

 INDIAN INSTITUTE OF MANAGEMENT BANGALORE
  Bannerghatta Road, Bangalore 560 076
  Phone : +91 - 80 - 2699 3264/3475/3742
  E-mail : openpro@iimb.ernet.in / edp@iimb.ernet.in
  http://www.iimb.ac.in/EEP

 
From: http://www.iimb.ernet.in/executive-education/open-programmes/Inside_pages/strategy-Digital-and-Social-Media-Strategies-Driving-Organizational-Performance.htm?management=StrategyAndGeneralManagement&addurl=S00640&Ref=Linkedin

Google Reporting for Bad Links or Content

Everybody wants his or her website to rank well in Google SERPs. But bad links and low-quality content may exist on your own website or on third-party websites.

You have access to your own website, so you can remove any bad link or content there, but how do you remove a bad link or content on a third-party website?

Yes, Google provides this facility: you can report such pages to Google and request that they be removed from its search index.

Below are some useful Google tools:

Report rich snippet spam:
https://support.google.com/webmasters/contact/rich_snippets_spam?hl=en

Request removal of content on a third-party website:
https://support.google.com/webmasters/answer/1663688

Remove content for legal reasons:
https://support.google.com/legal/troubleshooter/1114905?product=websearch&rd=2#ts=1115655,1282900,1115974

Remove personal information:
https://support.google.com/websearch/troubleshooter/3111061#ts=2889054

Remove content that is no longer live (old cached version of your site):
If you want to remove an outdated cache, tell Google how the page has changed. Type a word that no longer appears on the live page but is still present in the cached version.

https://www.google.com/webmasters/tools/removals?pli=1

Disavow Links Tool:

Note: This is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you disavow backlinks only if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you. In most cases, Google can assess which links to trust without additional guidance, so most normal or typical sites will not need to use this tool.

https://support.google.com/webmasters/answer/2648487?hl=en

https://www.google.com/webmasters/tools/disavow-links-main
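
The disavow tool accepts a plain text file listing the links you want Google to ignore, one per line; lines beginning with # are comments, and a domain: prefix disavows every link from that domain. A minimal sketch (the URLs and domain below are made-up examples):

    # individual spammy pages found in a link audit
    http://spam.example.com/bad-links.html
    http://spam.example.com/more-bad-links.html
    # disavow every link from this domain
    domain:shadyseo.example.net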

Raise a legal issue about a third-party website that hosts your links or content in a bad or illegal way:
https://support.google.com/legal/contact/lr_legalother?product=websearch

Remarketing with Google AdWords

To use remarketing on the Google Display Network, you may want to choose "Display Network only - Remarketing" as your campaign type when you create your campaign.

How remarketing works

Remarketing can help you reach people who have previously visited your website as they visit other sites on the Google Display Network or search on Google. Using remarketing, you can show these customers messages tailored to them based on which sections of your site they visited.

To start using remarketing, you need to add the remarketing tag, a small snippet of code that you get from AdWords, across all your site pages. Many sites have an identical footer for all pages, and this remarketing tag could be placed there.
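
As a rough sketch of this placement (the actual tag code is generated for you inside your AdWords account, so the comment below is only a placeholder and the footer markup is illustrative), a shared footer template might end like this:

    <!-- footer template included on every page of the site -->
    <div class="site-footer">
      <p>&copy; Example Company</p>
    </div>
    <!-- paste the remarketing tag generated by AdWords here,
         just before the closing </body> tag -->
    </body>
    </html>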

Once you've added the remarketing tag to your site, you can create remarketing lists for any of your webpages. For example, you could create a remarketing list for visitors to your most popular product category. The remarketing tag tells AdWords to save visitors to your "Popular category list." When people visit that page, their cookie id is added to the remarketing list.

Then, you create an AdWords campaign with a specific message to show only to people on your "Popular category list" while they search on Google or browse other Display Network sites. Your remarketing messages won't be shown to people who are not on the list.
Because the remarketing tag is on all of the pages of your website, you can develop more detailed audiences. For example, you could create lists not only for your most popular product category, but also for each of your other product categories or your shopping cart.
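
As an illustration of the kind of lists this makes possible (the list names and URL rules below are made-up examples, defined through the rule builder in AdWords rather than in code):

    List name: Popular category visitors
    Rule:      visited page URL contains /popular-category/

    List name: Shopping cart visitors
    Rule:      visited page URL contains /cart/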


How to Create a Remarketing List?

•    To run a remarketing campaign, you'll need a collection of cookies from people who visited your site, also called a remarketing list.
•    Your remarketing list can include people who visited a certain page on your site, and you can also create more advanced lists using templates and the rule builder.
•    Go to the "Shared library" in AdWords to create remarketing lists. Once your list is created, you can apply the list to your remarketing campaigns.


About creating a list

If you want to run a remarketing campaign to show ads to people who have previously visited your site, you'll need a remarketing list. A remarketing list is a collection of cookies from people who visited your site. Creating the list is one of the most important steps in setting up a remarketing campaign because you'd use this list to target your ads. You can start creating lists at any time, but they'll only start getting visitors after you've placed the remarketing tag on your site.
Create as many remarketing lists as you'd like. Once you've created lists, create new campaigns and add your lists to your ad groups.

•    To use remarketing on the Display Network, select "Display Network only - Remarketing" as your campaign type. The default settings in this campaign type can help you achieve better performance.

•    To use remarketing lists for search ads, select "Search Network only - All features" as your campaign type. Remarketing lists are not available for "Search Network only - Standard" or "Search & Display Networks - Standard" campaigns.

Note: Before you create your list, read about remarketing strategies and best practices to learn what kind of list best suits your goals.

Is Your Profile Verified on Facebook?



Some well-known public figures and Pages with large followings (ex: celebrities, journalists, government officials, popular brands and businesses) are verified by Facebook as having an authentic identity. You'll see a blue badge next to their names. Hover over the blue badge on a verified profile or Page to learn more.
We verify profiles or Pages to help you be sure that they are who they claim to be. Keep in mind that not all authentic profiles and Pages are verified and that you can't request to have your profile or Page verified. You can report fake accounts that are impersonating you, your business or your brand.
If your profile or Page isn't verified, there are other ways to help your followers or the people that like your Page know that your identity is authentic. For example, you can:
  • Link to your Facebook profile or Page from your official website
  • Complete the About section of your profile or Page to provide more information
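
Following the first suggestion above, a link on your official website might look like this (the Page URL is a placeholder):

    <a href="https://www.facebook.com/YourPageName">Follow us on Facebook</a>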



Can I request that my profile or Page be verified? 

Facebook automatically verifies certain accounts to help people find authentic profiles and Pages. We're not currently able to accept requests for verified profiles or Pages from the public. If you see a fake account that's impersonating you, your business or your brand, please report it.

Is Webmaster Showing Zero Indexed Pages?

Today I am going to tell you something about Google Webmaster Tools. Quite often we find "No Data Available" in Google Webmaster Tools,
or Webmaster Tools shows zero indexed pages under its "Index Status" option while still showing some indexed pages under its "Sitemaps" option.

Here is why this happens and how to solve it.

The most important thing to check is how you are adding your website URL in Webmaster Tools.

1. If your website is live without www, like http://example.com/, your site is indexed without www. In this case you need to add your website to Webmaster Tools as http://example.com/.

2. If your website is live with www, like http://www.example.com/, your site is indexed with www. In this case you need to add your website as http://www.example.com/.



How to Create a Robots.txt File?

According to Google webmasters:

The simplest robots.txt file uses two rules:
  • User-agent: the robot the following rule applies to
  • Disallow: the URL you want to block
These two lines are considered a single entry in the file. You can include as many entries as you want. You can include multiple Disallow lines and multiple user-agents in one entry.
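For instance, a single entry that applies to two user-agents and blocks two directories (the paths below are placeholders) would look like this:
User-agent: Googlebot
User-agent: Googlebot-News
Disallow: /archive/
Disallow: /scratch/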
Each section in the robots.txt file is separate and does not build upon previous sections. For example:
User-agent: *
Disallow: /folder1/

User-Agent: Googlebot
Disallow: /folder2/
In this example only the URLs matching /folder2/ would be disallowed for Googlebot.

User-agents and bots

A user-agent is a specific search engine robot. The Web Robots Database lists many common bots. You can set an entry to apply to a specific bot (by listing the name) or you can set it to apply to all bots (by listing an asterisk). An entry that applies to all bots looks like this:
User-agent: *
Google uses several different bots (user-agents). The bot we use for our web search is Googlebot. Our other bots like Googlebot-Mobile and Googlebot-Image follow rules you set up for Googlebot, but you can set up specific rules for these specific bots as well.

Blocking user-agents

The Disallow line lists the pages you want to block. You can list a specific URL or a pattern. The entry should begin with a forward slash (/).
  • To block the entire site, use a forward slash.
    Disallow: /
  • To block a directory and everything in it, follow the directory name with a forward slash.
    Disallow: /junk-directory/
  • To block a page, list the page.
    Disallow: /private_file.html
  • To remove a specific image from Google Images, add the following:
    User-agent: Googlebot-Image
    Disallow: /images/dogs.jpg 
  • To remove all images on your site from Google Images:
    User-agent: Googlebot-Image
    Disallow: / 
  • To block files of a specific file type (for example, .gif), use the following:
    User-agent: Googlebot
    Disallow: /*.gif$
  • To prevent pages on your site from being crawled, while still displaying AdSense ads on those pages, disallow all bots other than Mediapartners-Google. This keeps the pages from appearing in search results, but allows the Mediapartners-Google robot to analyze the pages to determine the ads to show. The Mediapartners-Google robot doesn't share pages with the other Google user-agents. For example:
    User-agent: *
    Disallow: /
    
    User-agent: Mediapartners-Google
    Allow: /
Note that directives are case-sensitive. For instance, Disallow: /junk_file.asp would block http://www.example.com/junk_file.asp, but would allow http://www.example.com/Junk_file.asp. Googlebot will ignore white-space (in particular empty lines) and unknown directives in the robots.txt.
Googlebot supports submission of Sitemap files through the robots.txt file.
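For example, a single Sitemap line pointing to your XML sitemap (the URL below is a placeholder) can be added to the file:
Sitemap: http://www.example.com/sitemap.xml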

Pattern matching

Googlebot (but not all search engines) respects some pattern matching.
  • To match a sequence of characters, use an asterisk (*). For instance, to block access to all subdirectories that begin with private:
    User-agent: Googlebot
    Disallow: /private*/
  • To block access to all URLs that include a question mark (?) (more specifically, any URL that begins with your domain name, followed by any string, followed by a question mark, followed by any string):
    User-agent: Googlebot
    Disallow: /*?
  • To specify matching the end of a URL, use $. For instance, to block any URLs that end with .xls:
    User-agent: Googlebot 
    Disallow: /*.xls$
    You can use this pattern matching in combination with the Allow directive. For instance, if a ? indicates a session ID, you may want to exclude all URLs that contain them to ensure Googlebot doesn't crawl duplicate pages. But URLs that end with a ? may be the version of the page that you do want included. For this situation, you can set your robots.txt file as follows:
    User-agent: *
    Allow: /*?$
    Disallow: /*?
    The Disallow: /*? directive will block any URL that includes a ? (more specifically, it will block any URL that begins with your domain name, followed by any string, followed by a question mark, followed by any string).
    The Allow: /*?$ directive will allow any URL that ends in a ? (more specifically, it will allow any URL that begins with your domain name, followed by a string, followed by a ?, with no characters after the ?).
Save your robots.txt file by downloading the file or copying the contents to a text file and saving as robots.txt. Save the file to the highest-level directory of your site. The robots.txt file must reside in the root of the domain and must be named "robots.txt". A robots.txt file located in a subdirectory isn't valid, as bots only check for this file in the root of the domain. For instance, http://www.example.com/robots.txt is a valid location, but http://www.example.com/mysite/robots.txt is not.