Friday, May 31, 2013

SEO Book.com



Growing An SEO Business By Removing Constraints

Posted: 31 May 2013 12:15 AM PDT

If you run an SEO business, or any service business, you'll know how hard it can be to scale up operations. There are many constraints that need to be overcome in order to progress.

We'll take a look at a way to remove barriers to growth and optimize service provision using the Theory Of Constraints. This approach proposes a method to identify the key constraints to performance which hinder growth and expansion.

The Theory Of Constraints has long been used for optimizing manufacturing...

We had no legs to stand on to maintain our current customer base let alone acquire and keep new business. This was not an ideal position to be in, particularly in a down economy when we couldn't afford to have sales reduce further

... but more recently, it's been applied to services, too.

The results were striking. The number of days to decide food stamp eligibility dropped from 15 to 11; phone wait times were reduced from 23 minutes to nine minutes. Budgetary savings have exceeded the $9 million originally cut

It's one way of thinking about how to improve performance by focusing on bottlenecks. If you're experiencing problems such as being overworked and not having enough time, it could offer a solution.

First we'll take a look at the theory, then apply it to an SEO agency. It can be applied to any type of business, of course.

Theory Of Constraints

Any manageable system is limited in achieving more of its goals by a very small number of constraints. There is always at least one constraint, and TOC uses a focusing process to identify the constraint and restructure the rest of the organization around it

If there weren't constraints, you could grow your business as large and as fast as you wanted.

You can probably think of numerous constraints that prevent you from growing your business. However, the theory holds that most constraints are really side issues, and that organizations are constrained by only one constraint at any one time.

A constraint is characterized as the "weakest link".

The weakest link holds everything else up. Once this constraint has been eliminated or managed, another "weakest link" may well emerge, so the process is repeated until the business is optimized. Constraints can be people, procedures, supplies, management, and systems.

In Dr. Eli Goldratt's book, "The Goal", Goldratt outlines five steps to identify and address the constraint:

  • Identify the constraint
  • Exploit the constraint
  • Subordinate everything else to the constraint
  • Elevate the constraint
  • Go back to step 1
    1. Identify The Constraint

    What is the biggest bottleneck that holds back company performance? What activity always seems to fall behind schedule, or take the most time? This activity might not be the main activity of the company. It could be administrative. It could be managerial.

    If you're not sure, try the "Five Whys" technique to help determine the root cause:

    By repeatedly asking the question "Why" (five is a good rule of thumb), you can peel away the layers of symptoms which can lead to the root cause of a problem. Very often the ostensible reason for a problem will lead you to another question. Although this technique is called "5 Whys," you may find that you will need to ask the question fewer or more times than five before you find the issue related to a problem

    2. Exploit The Constraint

    Once the constraint is identified, you then utilize the constraint to its fullest i.e. you try to make sure that constraint is working at maximum performance. What is preventing the constraint from working at maximum performance?

    If the constraint is staff, you might look at ways for people to produce more work, perhaps by automating some of their workload, or allocating less-essential work to someone else. It could involve more training. It could involve adopting different processes.

    3. Subordinate Everything Else To The Constraint

    Identify all the non-constraints that may prevent the constraint from working at maximum performance. These might be activities or processes the constraint has to undertake but aren't directly related to the constraint.

    For example, a staff member who is identified as a constraint might have a billing task that could either be automated or allocated to someone else.

    The constraint should not be limited by anything outside their control. The constraint can't do any more than it possibly can, i.e. if your constraint is time, you can't have someone work more than 24 hours in a day! More practically, eight hours a day.

    Avoid focusing on non-constraints. Optimizing non-constraints might feel good, but they won't do much to affect overall productivity.

    4. Elevate The Constraint

    Improve overall productivity by lifting the performance of the constraint. Once you've identified the constraint, and what is limiting its performance, you'll typically find spare capacity emerges. You then increase the workload. The productivity of the entire company is now lifted. Only then would you hire an additional person, if necessary.

    5. Repeat

    The final step is to repeat the process.

    The process is repeated because the weakest link may now move to another area of the business. For example, if more key workers have been hired to maximize throughput, then the constraint may have shifted to a management level, because the supervisory workload has increased.

    If so, this new constraint gets addressed via the same process.

    Applying The Theory Of Constraints To An SEO Agency

    Imagine Acme SEO Inc.

    Acme SEO are steadily growing their client base and have been meeting their clients' demands. However, they've noticed projects are taking longer and longer to finish. They're reluctant to take on new work, as it appears they're operating at full capacity.

    When they sit down to look at the business in terms of constraints, they find that they're getting the work, they're starting the work on time, but the projects slow down after that point. They frequently rush to meet deadlines, or miss them. SEO staff appear overworked. If the agency can't get through more projects, then they can't grow. Everything else in the business, from the reception to sales, depends on it. Do they just hire more people?

    They apply the five steps to define the bottleneck and figure out ways to optimize performance.

    Step One

    Identify the constraint. What is the weakest link? What limits the SEO business from doing more work? Is it the employees? Are they skilled enough? How about the systems they are using? Is there anything getting in the way of them completing their job?

    Try asking the Five Whys to get to the root of the problem:

    1. Why is this process taking so long? Because there is a lot of work involved.
    2. Why is there a lot of work involved? Because it's complex.
    3. Why is it complex? Because there is a lot of interaction with the client.
    4. Why is there a lot of interaction with the client? Because they keep changing their minds.
    5. Why do they keep changing their minds? Because they're not clear about what they want.

    Step Two

    Exploiting the constraint. How can the SEO work at maximum load?

    If an SEO isn't doing as much as they could be, is it due to project management and training issues? Do people need more direct management? More detailed processes? More training?

    It sounds a bit ruthless, especially when talking about people, but really it's about constructively dealing with the identified bottlenecks, as opposed to apportioning blame.

    In our example, the SEOs have the necessary skills and work hard, but the clients keep changing scope, which leads to a lot of rework and administrative overhead.

    Once that constraint has been identified, changes can be made to project management, eliminating the potential for scope creep after the project has been signed off, thus helping maximize the throughput of the worker.

    Step Three

    Subordinate everything else to the constraint. So, the process has been identified as the cause of a constraint. By redesigning the process to control scope creep before the SEO starts, say at a sales level, they free up more time. When the SEO works on the project, they're not having to deal with administrative overhead that has a high time cost, therefore their utility is maximized.

    The SEO is now delivering more forward momentum.

    Step Four

    Elevate the performance of the constraint. They monitor the performance of the SEO. Does the SEO now have spare capacity? Is the throughput increasing? Have they done everything possible to maximize the output? Are there any other processes holding up the SEO? Should the SEO be handling billing when someone else could be doing that work? Is the SEO engaged in pre-sales when that work could be handled by sales people?

    Look for work being done that takes a long time, but doesn't contribute to output. Can these tasks be handed to someone else - someone who isn't a constraint?

    If the worker is working at maximum utility, then adding another worker might solve the bottleneck. Once the bottleneck is removed, performance improves.

    Adding bodies is the common way a service-based industry, like SEO, scales up. A consultancy bills hours, and the more bodies, the more hours they can bill. However, if the SEO role is optimized to start with, then they might find they have spare capacity opening up, so they don't need as many new hires.

    Step Five

    Repeat.

    Goldratt stressed that using the Theory Of Constraints to optimize business is an on-going task. You identify the constraint - which may not necessarily be the most important aspect of the business i.e. it could be office space - which then likely shifts the weakest link to another point. You then optimize that point, and so on. Fixing the bottleneck is just the beginning of a process.

    It's also about getting down to the root of the problem, which is why the Five Whys technique can be so useful. Eliminating a bottleneck sounds simple, and a quick fix, but the root of the problem might not be immediately obvious.

    In our example, it appeared as though the staff were the problem, so the root cause could be misdiagnosed as "we need more staff". In reality, the root cause of the bottleneck was a process problem.

    Likewise, some problems aligned with an employee on a specific project might be tied to the specific client rather than anything internal to your company. Some people are never happy & will never be satisfied no matter what you do. Probably the best way to deal with people who are never satisfied is to end those engagements early, before they have much of an impact on your business. The best way to avoid such relationships in the first place is to have some friction upfront, so that those who contact you are serious about working with you.

    It can also be beneficial to have some of your own internal sites to fall back on, such that when consulting inquiries are light you do not chase revenue at the expense of lower margins from clients who are not a good fit. These internal projects also give you flexibility to deal with large updates, by being able to push some of your sites off into the background while putting out any fires that emerge from the update. And those sorts of sites give you a testing platform to further inform your strategy with client sites.

    How have you addressed business optimization problems? What techniques have you found useful, and how well did they work?

    Further Resources:

    I've skimmed across the surface, but there's a lot more to it. Here are some references used in the article, and further reading...


    Thursday, May 30, 2013

    SEO Book.com



    Advanced Web Ranking Review - Website Auditor

    Posted: 30 May 2013 02:37 AM PDT

    Advanced Web Ranking (AWR) is one of my favorite pieces of SEO software on the market today. It has been indispensable to me over the years. The software does it all and then some.

    I reviewed it a few years ago; you can read that here. Most of it is still relevant, and I'll be updating it in the near future. In this post I want to highlight their Website Auditor tool.

    Combining On and Off Page Factors

    The beauty of this feature is the simple integration of on and off-page elements. There are other tools on the market that focus solely on the on-page stuff (and do a fantastic job of it) and AWR does as well.

    The all-in-one nature of Advanced Web Ranking allows you to deftly move between the on and off (links, social, etc) page factors for a site (and its competition) inside of the Website Auditor feature. AWR has other tools built-in to go even deeper on competitive analysis as well.

    A quick FYI on some general settings and features:

    • You can crawl up to 10,000 pages on-demand
    • All results are exportable
    • Audits are saved so you can look at historical data trends
    • Complete white-label reporting is available
    • Because it's software it's all you can eat :) (save for the page limit)

    You can also set the tool to crawl only certain sections of a site as well as completely ignore certain sections or parameters so you can make the best use of your 10,000 page-crawl limit. This is a nice way to crawl a specific section of a site to find the most "social" content (limit the crawl to /blog as an example).

    Interface Overview

    Here's what the initial interface looks like:

    [Screenshot: AWR Website Auditor interface overview]

    It's a thick tool for sure, on the whole, but just focus on the Auditor piece. It's fairly self-explanatory but the top toolbar (left to right) shows:

    • Current site being viewed
    • Update date history for historical comparison
    • Filtering options (all pages or only specific pages: 200s, 404s, missing title tags; basically all the data points are available for slicing and dicing)
    • Button for on-page issues to show in the view area
    • Button for page-level external link data to show in the view area
    • Button for page-level social metrics (Twitter, Facebook, G+) to show in the view area
    • Update Project button (to update the Audit :D )
    • Text box where you can filter the results manually
    • Auditor settings (see below)
    • Link data source: Open Site Explorer for now (Majestic is available in other areas of AWR, and I'm told it will be available in Website Auditor as another option in the next release, 9.6, due out very soon)

    The tool settings button allows you to configure many areas of the Auditor tool to help get the exact data you want:

    [Screenshot: Website Auditor tool settings]

    On-Page and Off-Page Data Points

    The on-page overview gives you all of what is listed in the viewport shown previously and if you click on the Filter icon you'll be able to look at whatever piece of on-page data you'd like to:

    [Screenshot: Website Auditor page filters]

    I did just a short crawl here in order to show you how your data will look inside the tool. The view of the initial on-page report shows your traditional items such as:

    • Title tag info
    • Meta descriptions
    • Duplicate content
    • Robots and indexing information
    • Broken link and external link counts
    • Levels deep from the root
    • HTTP Status Code

    Each page can be clicked on to show specific information about that page:

    • Links from the page to other sites
    • Internal links to the page
    • Broken links
    • External links pointing into the page with anchor text data, Page Authority, and MozRank; whether the link is no-follow or an image is also shown
    • Broken link and external link counts
    • Levels deep from the root
    • HTTP Status Code

    The on-page overview is also referred to as the Issues Layout:

    [Screenshot: Website Auditor on-page view (Issues Layout)]

    The other 2 views are more of a mix of on-page and off-page factors.

    The Links Layout shows the following (for the root domain and for the sub-pages individually):

    • Levels deep from the homepage
    • Page Authority
    • MozRank
    • Linking Root Domains
    • Total Inbound Links
    • Outbound Links
    • No-follows
    • Inbound and Outbound Internal Links

    [Screenshot: Website Auditor Links Layout overview]

    In this view you can click on any of the crawled pages and see links to the page internally and externally as well as broken links.

    The Social Layout shows the following information:

    • Facebook Shares, Twitter Shares, and Google +1's for a given URL
    • Internal and external links to the page
    • Indexed or not
    • HTTP Status
    • Meta information
    • Broken Links

    [Screenshot: Website Auditor Social Layout]

    This data is helpful for finding content ideas, understanding a competitor's content/social strategy, and finding possible influencers to target in a link building/social awareness campaign for your site.

    Reporting and Scheduling

    Currently you can provide white label PDF/interactive HTML reports for the following:

    • Issues Layout
    • Link Layout
    • Social Layout

    You can also do a quick export from the viewport window inside the Website Auditor tab to get either an HTML/PDF/CSV export of the data you are looking at (list of link issues, social stats, on-page issues, and so on).

    Reports can be scheduled to run automatically so long as the computer AWR resides on is on and functional. You could also remote in with a service like LogMeIn to run an update remotely or use the AWR server plan where you host the AWR application on one machine and remote client machines (staff as an example) can connect to the shared database and make an update or run a report if needed.

    Advanced Web Ranking's Website Auditor is one of the most robust audit tools on the market and soon it will have integration with Majestic SEO (currently it ties into OpenSiteExplorer/Linkscape). It already pulls in social metrics from Twitter, Facebook, and G+ to give you a more comprehensive view of your site and your content.

    If you conduct technical audits or do competitive analysis you should give AWR a try, I think you'll like it :)


    Wednesday, May 29, 2013

    SEO Book.com



    LarryWorld

    Posted: 29 May 2013 06:58 AM PDT

    It's hard to disagree with Larry Page.

    In his recent speech at Google I/O, Page talked about privacy and how it impairs Google. "Why are people so focused on keeping their medical history private?" If only people would share more, then Google could do more.

    Well, quite.

    We look forward to Google taking the lead in this area and opening up their systems to public inspection. Perhaps they could start with the search algorithms. If Google would share more, publishers could do more.

    What's not to like? :)

    But perhaps that's comparing apples with oranges. The two areas may not be directly comparable as the consequences of opening up the algorithm would likely destroy Google's value. Google's argument against doing so has been that the results would suffer quality issues.

    Google would not win.

    TechnoUtopia

    If Page's vision sounds somewhat utopian, then perhaps we should consider where Google came from.

    In a paper entitled "The Politics Of Search: A Decade Retrospective", Laura Granka points out that when Google started out, the web was a more utopian place.

    A decade ago, the Internet was frequently viewed through a utopian lens, with scholars predicting that this increased ability to share, access, and produce content would reduce barriers to information access... Underlying most of this work is a desire to prevent online information from merely mimicking the power structure of the conglomerates that dominate the media landscape. The search engine, subsequently, is seen as an idealized vehicle that can differentiate the Web from the consolidation that has plagued ownership and content in traditional print and broadcast media

    At the time, researchers Introna and Nissenbaum felt that online information was too important to be shaped by market forces alone. They correctly predicted this would lead to a loss of information quality, and a lack of diversity, as information would pander to popular tastes.

    They advocated, perhaps somewhat naively in retrospect, public oversight of search engines and algorithm transparency to correct these weaknesses. They argued that doing so would empower site owners and users.

    Fast forward to 2013, and there is now more skepticism about such utopian values. Search engines are seen as the gatekeepers of information, yet they remain secretive about how they determine what information we see. Sure, they talk about their editorial process in general terms, but the details of the algorithms remain a closely guarded secret.

    In the past decade, we've seen a considerable shift in power away from publishers and towards the owners of big data aggregators, like Google. Information publishers are expected to be transparent - so that a crawler can easily gather information, or a social network can be, well, social - and this has advantaged Google and Facebook. It would be hard to run a search engine or a social network if publishers didn't buy into this utopian vision of transparency.

    Yet, Google aren't quite as transparent with their own operation. If you own a siren server, then you want other people to share and be open. But the same rule doesn't apply to the siren server owner.

    Opening Up Health

    Larry is concerned about constraints in healthcare, particularly around access to private data.

    "Why are people so focused on keeping their medical history private?" Page thinks it's because people are worried about their insurance. This wouldn't happen if there was universal care, he reasons.

    I don't think that's correct.

    People who live in areas where there is universal healthcare, like the UK, Australia and New Zealand, are still very concerned about the privacy of their data. People are concerned that their information might be used against them, not just by insurance companies, but by any company, not to mention government agencies and their employees.

    People just don't like the idea of surveillance, and they especially don't like the idea of surveillance by advertising companies who operate inscrutable black boxes.

    Not that good can't come from crunching the big data linked to health. Page is correct in saying there is a lot of opportunity to do good by applying technology to the health sector. But first companies like Google need to be a lot more transparent about their own data collection and usage in order to earn trust. What data are they collecting? Why? What is it used for? How long is it kept? Who can access it? What protections are in place? Who is watching the watchers?

    Google goes some way towards providing transparency with their privacy policy. A lesser-known facility, called Data Liberation, allows you to move data out of Google, if you wish.

    I'd argue that for people to trust Google to the level Page demands would require a lot more rigor and transparency, including third-party audits. There are also considerable issues to overcome in terms of government legislation, such as privacy acts. Perhaps the most important question is "how does this shift power balances?" No turkey votes for an early Christmas. If your job relies on being a gatekeeper of health information, you're hardly going to hand that responsibility over to Google.

    So, it's not a technology problem. And it's not just because people are afraid of insurance companies. And it's not because people aren't on board with the whole Burning-Man-TechnoUtopia vision. It's to do with trust. People would like to know what they're giving up, to whom, and what they're getting in return. And it's about power and money.

    Page has answered some of these questions, but not nearly enough of them. Something might be good for Google, and it might be good for others, but people want a lot more than just his word on it.

    Sean Gallagher writes in ArsTechnica:

    The changes Page wants require more than money. They require a change of culture, both political and national. The massively optimistic view that technology can solve all of what ails America—and the accompanying ideas on immigration, patent reform, and privacy—are not going to be so easy to force into the brains of the masses.

    The biggest reason is trust. Most people trust the government because it's the government—a 226-year old institution that behaves relatively predictably, remains accountable to its citizens, and is governed by source code (the Constitution) that is hard to change. Google, on the other hand, is a 15-year old institution that is constantly shifting in nature, is accountable to its stockholders, and is governed by source code that is updated daily. You can call your Congressman and watch what happens in Washington on C-SPAN every day. Google is, to most people, a black box that turns searches and personal data into cash.

    And it may do so at their expense, not benefit.


    Tuesday, May 28, 2013

    SEO Book.com



    GoogleMart

    Posted: 27 May 2013 04:29 PM PDT

    It was hard to spot, at first.

    It started with one store on the outskirts of town. It was big. Monolithic. It amalgamated a lot of cheap, largely imported stuff and sold the stuff on. The workers were paid very little. The suppliers were squeezed tight on their margins.

    And so it grew.

    And as it grew, it hollowed out the high street. The high street could not compete with the monolith's sheer power. They couldn't compete with the monolith's influence on markets. They couldn't compete with the monolith's unique insights gained from clever number crunching of big data sets.

    I'm talking about Walmart, of course.

    Love 'em or loathe 'em, Walmart gave people what they wanted, but in so doing, hollowed out a chunk of America's middle class. It displaced a lot of shop keepers. It displaced small business owners on Main Street. It displaced the small family retail chain that provided a nice little middle class steady earner.

    Where did all those people go?

    It was not only the small, independent retail businesses and local manufacturers who were fewer in number. Their closure triggered flow-on effects. There was less demand for the services they used, such as local small business accountants, the local lawyer, small advertising companies, local finance companies, and the host of service providers that make up the middle class ecosystem.

    Where did they all go?

    Some would have taken up jobs at WalMart, of course. Some would become unemployed. Some would close their doors and take early retirement. Some would change occupations and some would move away to where prospects were better.

    What does any of this have to do with the internet?

    The same thing is happening on the internet.

    And if you're a small business owner, located on the web-equivalent of the high street, or your business relies on those same small business owners, then this post is for you.

    Is Technology Gutting The Middle Class?

    I've just read "Who Owns The Future", by Jaron Lanier. Anyone who has anything to do with the internet - and anyone who is even remotely middle class - will find it asks some pretty compelling questions about our present and future.

    Consider this.

    At the height of its power, the photography company Kodak employed more than 140,000 people and was worth $28 billion. They even invented the first digital camera. But today Kodak is bankrupt, and the new face of digital photography has become Instagram. When it was sold to Facebook for a billion dollars in 2012, Instagram only employed 13 people

    Great for Instagram. Bad for Kodak. And bad for the people who worked for Kodak. But, hey. That's progress, right? Kodak had an outdated business model. Technology overtook them.

    That's true. It is progress. It's also true that all actions have consequences. The consequence of transformative technology is that, according to Lanier, it may well end up destroying the middle class if too much of the value is retained in the large technology companies.

    Lanier suggests that the advance of technology is not replacing as many jobs as it destroys, and those jobs that are destroyed are increasingly middle class.

    Not Political (Kinda)

    I don't wish to make this post political, although all change is inherently political. I'm not taking political sides. This issue cuts across political boundaries. I have a lot of sympathy for technological utopian ideas and the benefits technology brings, and have little time for luddism.

    However, it's interesting to focus on the consequences of this shift in wealth and power brought about by technology, and whether enough people in the internet value chain receive adequate value for their efforts.

    If the value doesn't flow through, as capitalism requires in order to function well, then few people win. Are children living at home longer than they used to? Are people working longer hours than they used to in order to have the same amount of stuff? Has the value chain been broken, Lanier asks? And, if so, what can be done to fix it?

    What Made Instagram Worth One Billion Dollars?

    Lanier points out that Instagram wasn't worth a billion dollars because it had extraordinary employees doing amazing things.

    The value of Instagram came from network effects.

    Millions of people using Instagram gave the Instagram network value. Without the user base, Instagram is just another photo app.

    Who got paid in the end? Not the people who gave the network value. The people who got paid were the small group at the top who organized the network. The owners of the "Siren Servers":

    The power rests in what Lanier calls the "Siren Servers": giant corporate repositories of information about our lives that we have given freely and often without consent, now being used for huge financial benefit by a super-rich few

    The value is created by all the people who make up the network, but they only receive a small sliver of that value in the form of a digital processing tool. To really benefit, you have to own, or get close to, a Siren Server.

    Likewise, most of Google's value resides in the network of users. These users feed value into Google simply by using it and thereby provide Google with a constant stream of data. This makes Google valuable. There isn't much difference between Google and Bing in terms of service offering, but one is infinitely more valuable than the other purely by virtue of the size of the audience. Same goes for Facebook over Orkut.

    You Provide Value

    Google are provided raw materials by people. Web publishers allow Google to take their work, at no charge, and for Google to use that work and add value to Google's network. Google then charges advertisers to place their advertising next to the aggregated information.

    Why do web publishers do this?

    Publishers create and give away their work in the hope they'll get traffic back, from which they may derive benefit. Some publishers make money, so they can then pay real-world expenses, like housing, food and clothing. The majority of internet publishers make little, or nothing, from this informal deal. A few publishers make a lot. The long tail, when it comes to internet publishing, is pretty long. The majority of wealth, and power, is centralized at the head.

    Similarly, Google's users are giving away their personal information.

    Every time someone uses Google, they are giving Google personal information of value. Their search queries. Their browsing patterns. Their email conversations. Their personal network of contacts. Aggregate that information together, and it becomes valuable information, indeed. Google records this information, crunches it looking for patterns, then packages it up and sells it to advertisers.

    What does Google give back in return?

    Web services.

    Is it a fair exchange of value?

    Lanier argues it isn't. What's more, it's an exchange of value so one-sided that it's likely to destroy the very ecosystem on which companies like Google are based - the work output, and the spending choices, of the middle class. If few of the people who publish can make a reasonable living doing so, then the quality of what gets published must decrease, or cease to exist.

    People could make their money in other ways, including offline. However, consider that the web is affecting a lot of offline business already. The music industry is a faint shadow of what it once was, even as recently as a decade ago. There are a lot fewer middle class careers in the music industry now. Small retailers are losing out to the web. Fewer jobs there. The news industry is barely making any money. Same goes for book publishers. All these industries are struggling as online aggregators carve up their value chains.

    Now, factor in all the support industries of these verticals. Then think about all the industries likely to be affected in the near future - like health, or libraries, or education, for example. Many businesses that used to hire predominantly middle class people are going out of business, downsizing their operations, or soon to have chunks of their value displaced.

    It's not Google's aim to gut the middle class, of course. This post is not an anti-Google rant, either, simply a look at action and consequence. What is the effect of technology and, in particular, the effect of big technology companies on the web, most of whom seem obsessed with keeping you in their private, proprietary environments for as long as possible?

    Google's aim is to index all the world's information and make it available. That's a good aim. It's a useful, free service. But Lanier argues that gutting the middle class is a side-effect of re-contextualising, and thereby devaluing, information. Information may want to be free, but the consequence of free information is that those creating the information may not get paid. Many of those who do get paid may be weaker organizations more willing to sacrifice editorial quality in order to stay in business. We already see major news sites with MFA-styled formatting on unvetted syndicated press releases. What next?

    You may notice that everyone is encouraged to "share" - meaning "give away" - but sharing doesn't seem to extend to the big tech companies, themselves.

    They charge per click.

    Robots.txt

    One argument is that if someone doesn't like Google, or any search engine, they should simply block that search engine via robots.txt. The problem with that argument is it's like saying if you don't like aspects of your city, you should move to the middle of the wilderness. You could, but really you'd just like to make the city a better place to be, and to see it thrive and prosper, and be able to thrive within it.
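
    To be clear, the mechanics of opting out are trivial. A two-line robots.txt at the site root, along the lines of the sketch below (swap in whichever crawler you want to exclude), is all it takes to ask Googlebot to stay away. The argument isn't about the mechanics; it's about whether leaving the city is a reasonable answer.

        User-agent: Googlebot
        Disallow: /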

    Google provides useful things. I use Google, just like I use my iPhone. I know the deal. I get the utility in exchange for information, and this exchange is largely on their terms. What Lanier proposes is a solution that not only benefits the individual, and the little guy, but ultimately the big information companies, themselves.

    Money Go Round

    Technology improvements have created much prosperity and the development of a strong middle class. But the big difference today is that what is being commoditized is information itself. In a world increasingly controlled by software that acts as our interface to information, if we commoditize information then we commoditize everything else.

    If those creating the information don't get paid, quality must decrease, or become less available than it otherwise would be. They can buy less stuff in the real world. If they can't buy as much stuff in the real world, then Google and Facebook's advertisers have fewer people to talk to than they otherwise would.

    It was all a social construct to begin with, so what changed, to get to your question, is that at the turn of the [21st] century it was really Sergey Brin at Google who just had the thought of, well, if we give away all the information services, but we make money from advertising, we can make information free and still have capitalism. But the problem with that is it reneges on the social contract where people still participate in the formal economy. And it's a kind of capitalism that's totally self-defeating because it's so narrow. It's a winner-take-all capitalism that's not sustaining

    That isn't a sustainable situation long-term. A winner-takes-all system centralizes wealth and power at the top, whilst everyone else occupies the long tail. Google has deals in place with large publishers, such as AP, AFP and various European agencies, but this doesn't extend to smaller publishers. It's the same in sports. The very top get paid ridiculous amounts of money whilst those only a few levels down are unlikely to make rent on their earnings.

    But doesn't technology create new jobs? People who were employed at Kodak just go do something else?

    The latest waves of high tech innovation have not created jobs like the old ones did. Iconic new ventures like Facebook employ vastly fewer people than big older companies like, say, General Motors. Put another way, the new schemes ... channel much of the productivity of ordinary people into an informal economy of barter and reputation, while concentrating the extracted old-fashioned wealth for themselves. All activity that takes place over digital networks becomes subject to arbitrage, in the sense that risk is routed to whoever suffers lesser computation resources

    The people who will do well in such an environment will likely be employees of those who own the big data networks, like Google. Or they will be the entrepreneurial and adaptable types who manage to get close to them - the companies that serve WalMart or Google, or Facebook, or large financial institutions, or leverage off them - but Lanier argues there simply aren't enough of those roles to sustain society in a way that gave rise to these companies in the first place.

    He argues this situation disenfranchises too many people, too quickly. And when that happens, the costs spread to everyone, including the successful owners of the networks. They become poorer than they would otherwise be by not returning enough of the value that enables the very information they need to thrive. Or another way of looking at it - who's going to buy all the stuff if only a few people have the money?

    The network, whether it be a search engine, a social network, an insurance company, or an investment fund, uses information to concentrate power. Lanier argues they are all the same, as they operate in pretty much the same way. They use network effects to mine and crunch big data, and this, in turn, grows their position at the expense of smaller competitors, and the ecosystem that surrounds them.

    It doesn't really matter what the intent was. The result is that the technology can prevent the middle class from prospering and when that happens, everyone ultimately loses.

    So What Does He Propose Can Be Done?

    A few days ago, Matt Cutts released a video about what site owners can expect from the next round of Google changes.

    Google have announced a web spam change, called Penguin 2.0. They'll be "looking at" advertorials, and native advertising. They'll be taking a "stronger line" on this form of publishing. They'll also be "going upstream" to make link spammers less effective.

    Of course, whenever Google release these videos, the webmaster community goes nuts. Google will be making changes, and these changes may either make your day, or send you to the wall.

    The most interesting aspect of this, I think, is the power relationship. If you want to do well in Google's search results then there is no room for negotiation. You either do what they want or you lose out. Or you may do what they want and still lose out. Does the wealth and power sit with the publisher?

    Nope.

    In other news, Google just zapped another link network.

    Cutts warns there's a lot of this happening, and they'll be going after it. Does wealth and power sit with the link buyer or seller?

    Nope.

    Now, Google are right to eliminate or devalue sites that they feel devalues their search engine. Google have made search work. Search was all but dead twelve years ago due to the ease with which publishers could manipulate the results, typically with off-topic junk. The spoils of solving this problem have flowed to Google.

    The question is has too much wealth flowed to companies like Google, and is this situation going to kill off large chunks of the ecosystem on which it was built? Google isn't just a player in this game, they're so pervasive they may as well be the central planner. Cutts is running product quality control. The customers aren't the publishers, they're the advertisers.

    It's also interesting to note what these videos do not say. Cutts' video was not about how your business could be more prosperous. It was all about your business doing what Google wants in order for Google to be more prosperous. It's irrelevant whether you disagree or not, as you don't get to dictate terms to Google.

    That's the deal.

    Google's concern lies not with webmasters, just as Walmart's concern lies not with small-town retailers. Their concern is to meet company goals and enhance shareholder value. The effects aren't Google's or Walmart's fault. They are just that - effects.

    The effect of Google pursuing those objectives might be to gouge out the value of publishing, and in so doing, gouge out a lot of the value of the middle class. The Google self-driving car project is fascinating from a technical point of view - the view Google tends to focus on - but perhaps even more fascinating when looked at from a position they seldom seem to consider, at least not in public: namely, what happens to all those taxi drivers and delivery drivers who get their first break in society doing this work? Typically, these people are immigrants. Typically, they are poor but upwardly mobile.

    That societal effect doesn't appear to be Google's concern.

    So who's concern should it be?

    Well, perhaps it really should be Google's concern, as it's in their own long-term best interest:

    Today, a guitar manufacturer might advertise through Google. But when guitars are someday spun out of 3D printers, there will be no one to buy an ad if guitar designs are "free". Yet Google's lifeblood is information put online for free. That is what Google's servers organize. Thus Google's current business model is a trap in the longterm

    Lanier's suggestion is that everyone gets paid, via micro-payments, linked back to the value they helped create. These payments continue so long as people are using their stuff, be it a line of code, a photograph, a piece of music, or an article.

    For example, if you wrote a blog post, and someone quoted a paragraph of it, you would receive a tiny payment. The more often you're quoted, the more relevant you are, therefore the more payment you receive. If a search engine indexes your pages, then you receive a micro-payment in return. If people view your pages, you receive a micro-payment. Likewise, when you consume, you pay. If you conduct a search, then you run Google's code, and Google gets paid. The payments are tiny, as far as the individual is concerned, but they all add up.
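
    As a back-of-the-envelope illustration only - this is my toy sketch of the accounting idea, not anything Lanier actually specifies, and the participants and rate are made up - the core bookkeeping is simply crediting a creator and debiting a consumer on every recorded use:

        # Toy sketch of two-way micro-payment accounting (illustrative only;
        # the participants and the flat rate are hypothetical).
        from collections import defaultdict

        RATE = 10  # hypothetical micro-units per recorded use

        balances = defaultdict(int)

        def record_use(creator: str, consumer: str) -> None:
            """Someone quotes, indexes, or views a creator's work: the creator
            accrues a tiny credit and the consumer pays a tiny debit."""
            balances[creator] += RATE
            balances[consumer] -= RATE

        # A blogger's paragraph is quoted twice and indexed by a search engine;
        # the blogger then runs one search, paying the owner of the search code.
        record_use(creator="blogger", consumer="news_site")
        record_use(creator="blogger", consumer="news_site")
        record_use(creator="blogger", consumer="search_engine")
        record_use(creator="search_engine", consumer="blogger")

        print(dict(balances))  # {'blogger': 20, 'news_site': -20, 'search_engine': 0}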

    Mammoth technical issues of doing this aside, the effect would be to take money from the head and pump it back into the tail. It would be harder to build empires off the previously free content others produce. It would send money back to producers.

    It also eliminates the piracy question. Producers would want people to copy, remix and redistribute their content, as the more people that use it, the more money they make. Also, with the integration of two-way linking, the mechanism Lanier proposes to keep track of ownership and credit, you'd always know who is using your content.

    Information would no longer be free. It would be affordable, in the broadest sense of the word. There would also be a mechanism to reward the production, and a mechanism to reward the most relevant information the most. The more you contribute to the net, and the more people use it, the more you make. Tiny payments. Incremental. Ongoing.

    Interesting Questions

    So, if these questions are of interest to you, I'd encourage you to read "Who Owns The Future" by Jaron Lanier. It's often rambling - in a good way - and heads off on wild tangents - in a good way - and you can tell there is a very intelligent and thoughtful guy behind it all. He's asking some pretty big, relevant questions. His answers are sketches that should be challenged, argued, debated and enlarged.

    And if big tech companies want a challenge that really will change the world, perhaps they could direct all that intellect, wealth and power towards enriching the ecosystem at a pace faster than they potentially gouge it.


    Wednesday, May 15, 2013

    SEO Book.com



    Why the Yahoo! Search Revenue Gap Won't Close

    Posted: 14 May 2013 05:51 AM PDT

    In spite of Yahoo! accepting revenue guarantees for another year from Microsoft, recently there has been speculation that Yahoo! might want to get out of their search ad deal with Microsoft. I am uncertain if the back channeled story is used as leverage to secure ongoing minimum revenue agreements, or if Yahoo! is trying to set the pretext narrative to later be able to push through a Google deal that might otherwise get blocked by regulators.

    When mentioning Yahoo!'s relative under-performance on search, it would be helpful to point out the absurd amount of their "search" traffic from the golden years that was various forms of arbitrage. Part of the reason (likely the primary reason) Yahoo! took such a sharp nose dive in terms of search revenues (from $551 million per quarter to as low as $357 million per quarter) was that Microsoft used quality scores to price down the non-search arbitrage traffic streams & a lot of that incremental "search" volume Yahoo! had went away.

    There were all sorts of issues in place that are rarely discussed. Exit traffic, unclosable windows, forcing numerous clicks, iframes in email spam, raw bot clicks, etc. ... and some of this was tied to valuable keyword lists or specific juicy keywords. I am not saying that Google has outright avoided all arbitrage (Ask does boatloads of it in paid + organic & Google at one point tested doing some themselves on credit card keywords), but it has generally been a sideshow at Google, whereas it was the main attraction at Yahoo!.

    And that is what drove down Yahoo!'s click prices.

    Yahoo! went from almost an "anything goes" approach to their ad feed syndication, to the point where they made a single syndication partner Cyberplex's Tsavo Media pay them $4.8 million for low quality traffic. There were a number of other clawbacks that were not made public.

    Given that we are talking $4.8 million for a single partner & this alleged overall revenue gap between Google AdWords & Bing Ads is somewhere in the $100 million or so range, these traffic quality issues & Microsoft cleaning up the whoring of the ad feed that Yahoo! partners were doing is a big deal. It had a big enough impact that it caused some of the biggest domain portfolios to shift from Yahoo! to Google. I am a bit surprised to see it so rarely mentioned in these discussions.

    Few appreciate how absurd the abuses were. For years Yahoo! not only required you to buy syndication (they didn't have a Yahoo!-only targeting option until 2010 & that only came about as a result of a lawsuit) but even when you blocked a scammy source of traffic, if that scammy source was redirecting through another URL you would have no way of blocking the actual source, as mentioned by Sean Turner:

    To break it down, yahoo gives you a feed for seobook.com & you give me a feed for turner.com. But all links that are clicked on turner.com redirect through seobook.com so that it shows up in customer logs as seobook.com If you block seobook.com, it will block ads from seobook.com, but not turner.com. The blocked domain tool works on what domains display, not on where the feed is redirected through. So if you are a customer, there is no way to know that turner.com is sending traffic (since it's redirecting through seobook.com) and no way to block it through seobook.com since that tool only works on the domain that is actually displaying it.

    I found it because we kept getting traffic from gogogo.com. We had blocked it over and over and couldn't figure out why they kept sending us traffic. We couldn't find our ad on their site. I went to live.com and ran a site:gogogo.com search and found that it indexed some of those landing pages that use gogogo.com as a monetization service.
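
    To make the mechanics Turner describes concrete, here is a minimal sketch (my illustration, reusing the domains from his example): the partner page never links to the advertiser directly, it links to the feed provider's click redirector, so the redirector's domain is the only one the advertiser's logs and blocked-domain tool ever see.

        # Illustrative sketch only, using the example domains from the quote above:
        # turner.com prints ad links that bounce through a seobook.com redirector,
        # so blocking "seobook.com" never touches the real source, turner.com.
        from urllib.parse import urlencode

        def ad_href_printed_on_partner_page(destination: str) -> str:
            """The href turner.com actually renders: a seobook.com redirect URL.
            The advertiser only ever sees a request arriving via the redirector."""
            return "http://seobook.com/click?" + urlencode({"dest": destination})

        print(ad_href_printed_on_partner_page("http://advertiser-example.com/landing"))
        # http://seobook.com/click?dest=http%3A%2F%2Fadvertiser-example.com%2Flanding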

    The other thing that isn't mentioned is the long-term impact of a Yahoo! tie-up with Google. Microsoft pays Yahoo! an 88% revenue share (and further guarantees on top of that), provides the organic listings free, manages all the technology, and allows Yahoo! to insert their own ads in the organic results.

    If Bing were to exit the online ad market, maybe Yahoo! could make an extra $100 million in the first year of an ad deal with Google, but if there is little to no competition a few years down the road, then when it comes time for Yahoo! to negotiate revenue share rates with Google, you know Google would cram down a bigger rake.

    This isn't blind speculation or theory, but aligned with Google's current practices. Look no further than Google's current practices with YouTube, where "partners" are paid different rates & are forbidden to mention their rates publicly: "The Partner Program forbids participants to reveal specifics about their ad-share revenue."

    Transparency is a one way street.

    Google further dips into leveraging that "home team always wins" mode of negotiating rates by directly investing in some of the aggregators/networks which offer sketchy confidential contracts soaking the original content creators (http://obviouslybenhughes.com/post/13933948148/before-you-sign-that-machinima-contract-updated):

    As I said, the three images were posted on yfrog. They were screenshots of an apparently confidential conversation had between MrWonAnother and a partner support representative from Machinima, in which the representative explained that the partner was locked indefinitely into being a Machinima partner for the rest of eternity, as per signed contract. I found this relevant, informative and honestly shocking information and decided to repost the images to obviouslybenhughes.com in hopes that more people would become aware of the darker side of YouTube partnership networks.

    Negotiating with a monopoly that controls the supply chain isn't often a winning proposition over the long run.

    Competition (or at least the credible risk of it) is required to shift the balance of power.

    The flip side of the above situation - where competition does help market participants to get a better revenue share - can be seen in the performance of AOL in their ad negotiation in 2005. AOL's credible threat to switch to Microsoft had Google invest a billion dollars into AOL, where Google later had to write down $726 million of that investment. If there was no competition from Microsoft, AOL wouldn't have received that $726 million (and likely would have had a lower revenue sharing rate and missed out on some of the promotional AdWords credits they received).

    The same sort of "shifted balance of power" was seen in the Mozilla search renewal with Google, where Google paid Mozilla 3X as much due to a strong bid from Microsoft.

    The iPad search results are becoming more like phone search results, where ads dominate the interface & a single organic result is above the fold. And Google pushed their "enhanced" ad campaigns to try to get advertisers to pay higher ad rates on those clicks. It would be a boon for Google if they can force advertisers to pay the same CPC as desktop & couple it with that high mobile ad CTR.

    Google owning Chrome + Android & doing deals with Apple + Mozilla means that it will be hard for either Microsoft or Yahoo! to substantially grow search marketshare. But if they partner with Google it will be a short term lift in revenues and dark clouds on the horizon.

    I am not claiming that Microsoft is great for Yahoo!, or that they are somehow far better than Google, only that Yahoo! is in a far better position when they have multiple entities competing for their business (as highlighted in the above Mozilla & AOL examples).

    Tuesday, May 14, 2013

    SEO Book.com



    Link Madness

    Posted: 14 May 2013 05:24 AM PDT

    Link paranoia is off the scale. As the "unnatural link notifications" fly, the normally jittery SEO industry has moved deep into new territory, of late.

    I have started to wonder if some of these links (there are hundreds since the site is large) may be hurting my site in the Google Algo. I am considering changing most of my outbound links to rel="nofollow". It is not something I want to do but ...

    We've got site owners falling to their knees, confessing to be link spammers, and begging for forgiveness. Even when they do, many sites don't return. Some sites have been returned, but their rankings, and traffic, haven't recovered. Many sites carry similar links, but get a free pass.

    That's the downside of letting Google dictate the game, I guess.

    Link Removal

    When site owners are being told by Google that their linking is "a problem," they tend to hit the forums and spread the message, so the effect is multiplied.

    Why does Google bother with the charade of "unnatural link notifications," anyway?

    If Google has found a problem with links to a site, then they can simply ignore or discount them, rather than send reports prompting webmasters to remove them. Alternatively, they could send a report saying they've already discounted them.

    So one assumes Google's strategy is a PR - as in public relations - exercise to plant a bomb between link buyers and link sellers. Why do that? Well, a link is a link, and one could conclude that Google must still have problems nailing down the links they don't like.

    So they get some help.

    The disavow links tool, combined with a re-inclusion request, is pretty clever. If you wanted a way to get site owners to admit to being link buyers, and to point out the places from which they buy links, or you wanted to build a database of low quality links, for no money down, then you could hardly imagine a better system of outsourced labour.

    If you're a site owner, getting hundreds, if not thousands, of links removed is hardly straightforward. It's difficult, takes a long time, and is ultimately futile.

    Many site owners inundated with link removal requests have moved to charging removal fees, which in many cases is understandable, given it takes some time and effort to verify the true owner of a link, locate the link, remove it, and update the site.

    As one rather fed-up sounding directory owner put it:

    Blackmail? Google's blackmailing you, not some company you paid to be listed forever. And here's a newsflash for you. If you ask me to do work, then I demand to be paid. If the work's not worth anything to you, then screw off and quit emailing me asking me to do it for free.

    Find your link, remove it, confirm it's removed, email you a confirmation, that's 5 minutes. And that's $29US. Don't like it? Then don't email me. I have no obligation to work for you for free, not even for a minute. …. I think the best email I got was someone telling me that $29 was extortion. I then had to explain that $29 wasn't extortion - but his new price of $109 to have the link removed, see, now THAT'S extortion.

    if it makes you sleep at night, you might realize that you paid to get in the directory to screw with your Google rankings, now you get to pay to get out of the directory, again to screw your Google rankings. That's your decision, quit complaining about it like it's someone else's fault. Not everyone has to run around in circles because you're cleaning up the very mess that you made

    Heh.

    In any case, if these links really did harm a site - which is arguable - then it doesn't take a rocket scientist to guess the next step. Site owners would be submitting their competitors' links to directories thick and fast.

    Cue Matt Cutts on negative SEO....

    Recovery Not Guaranteed

    Many sites don't recover from Google penalties, no matter what they do.

    It's conceivable that a site could have a permanent flag against it no matter what penance has been paid. Google takes into account your history in AdWords, so it's not a stretch to imagine similar flags may continue to exist against domains in their organic results.

    The most common reason is not what they're promoting now, it's what they've promoted in the past.
    Why would Google hold that against them? It's probably because of the way affiliates used to churn and burn domains they were promoting in years gone by...

    This may be the reason why some recovered sites just don't rank like they used to after they've returned. They may carry permanent negative flags.

    However, the reduced rankings and traffic when/if a site does return may have nothing to do with low-quality links or previous behaviour. There are many other factors involved in ranking, and Google's algorithm updates aren't sitting still, so the cause is always difficult to pin down.

    Which is why the SEO environment can be a paranoid place.

    Do Brands Escape?

    Matt Cutts is on record discussing big brands, saying they get punished, too. You may recall the case of Interflora UK.

    Google may well punish big brands, but the punishment might be quite different to the punishment handed out to a no-brand site. It will be easier for a big brand to return, because if Google doesn't show what Google users expect to see in the SERPs, then Google looks deficient.

    Take, for example, this report received - amusingly - by the BBC:

    I am a representative of the BBC site and on Saturday we got a 'notice of detected unnatural links'. Given the BBC site is so huge, with so many independently run sub sections, with literally thousands of agents and authors, can you give us a little clue as to where we might look for these 'unnatural links'?

    If I was the BBC webmaster, I wouldn't bother. Google isn't going to dump the BBC sites as Google would look deficient. If Google has problems with some of the links pointing to the BBC, then perhaps Google should sort it out.

    Take It On The Chin, Move On

    Many of those who engaged in aggressive link tactics knew the deal. They went looking for an artificial boost in relevancy, and as a result of link building, they achieved a boost in the SERPs.

    That is playing the game that Google, a search engine that factors in backlinks, "designed". By design, Google rewards well-linked sites by ranking them above others.

    The site owners enjoyed the pay-off at the expense of their less aggressive competitors. The downside - there's always a downside - is that Google may spot the artificial boost in relevancy, now or in the future, and may slam the domain as a result.

    That's part of the game, too.

    Some cry about it, but Google doesn't care about crying site owners, so site owners should build that risk into their business case from the get-go.

    Strategically, there are two main ways of looking at "the game":

    Whack A Mole: Use aggressive linking for domains you're prepared to lose. If you get burned, then that's a cost of playing the game. Run multiple domains using different link graphs for each and hope that a few survive at any one time, thus keeping you in the game. If some domains get taken out, then take it on the chin. Try to get reinstated, and if you can't, then torch them and move on.

    Ignore Google: If you operate like Google doesn't exist, then it's pretty unlikely Google will slam you, although there are no guarantees. In any case, a penalty and a low ranking are the same thing in terms of outcome.

    Take one step back. If your business relies on Google rankings, then that's a business risk. If you rely entirely on Google rankings, then that's a big business risk. I'm not suggesting it's not a risk worth taking, but only you can decide what risks make sense for your business.

    If the whack-a-mole strategy is not for you, and you want to lower the business risk of Google's whims, then it makes sense to diversify the ways in which you get traffic so that if one traffic stream fails, all is not lost. If you're playing for the long term, then establishing brand, diversifying traffic, and treating organic SEO traffic as a bonus should be considerations. You then don't need to worry about what Google may or may not do, as Google isn't fueling your engine.

    Some people run both these strategies simultaneously, which is an understandable way of managing risk. Most people probably sit somewhere in the middle and hope for the best.

    Link Building Going Forward

    The effect of Google's fear, uncertainty and doubt strategy is that a lot of site owners are going to be running scared or confused, or both.

    Just what is acceptable?

    Trouble is, what is deemed acceptable today might be unacceptable next week. It's pretty difficult, if not impossible, for a site owner to wind the clock back once they undertake a link strategy, and who knows what will be deemed unacceptable in a year's time.

    Of course, Google doesn't want site owners to think in terms of a "link strategy", if the aim of said link strategy is to "inflate rankings". That maxim has remained constant.

    If you want to take a low-risk approach, then it pays to think of Google traffic as a bonus. Brett Tabke, founder of WebmasterWorld, used to keep a sticker on his monitor that said "Pretend The Search Engines Don't Exist", or words to that effect. I'm reminded of how useful that message still is today, as it's a prompt to think strategically beyond SEO. If you disappeared from Google today, would your business survive? If the answer is no, then you should revise your strategy.

    Is there a middle ground?

    Here are a few approaches to link building that will likely stand the test of time, and that incorporate strategy to provide resilience against Google's whims. The key is having links for reasons besides SEO, even if part of their value is higher rankings.

    1. Publisher

    Publish relevant, valuable content, as determined by your audience.

    It's no longer enough to publish pages of information on a topic; the information must have demonstrable utility, i.e. other people need to deem it valuable, reference it, visit it, and talk about it. Instead of putting your money into buying links, you put your money into content development and then into marketing it to people. The links will likely follow. This is passive link acquisition.

    It's unlikely these types of links will ever be a problem, as the link graph is not going to look contrived. If any poor quality links slip into this link graph, then they're not going to be the dominant feature. The other signals will likely trump them and therefore diminish their impact.

    Build brand based on unique, high-quality information, market it to people via multiple channels, and the links tend to follow, which then boosts your ranking in Google. Provide a high degree of utility, first and foremost.

    One problem with this model is that it's easy for other people to steal your utility. This is a big problem, and it discourages investment in quality content. One way around it is to use some content as a loss-leader and lock the rest away behind paywalls. You give the outside world, and Google, just enough, but if they want the rest, then they're going to need to sign up.

    Think carefully about the return on giving the whole farm away to a crawler. Think about providing utility, not "content".

    2. Differentiation

    There is huge first mover advantage when it comes to getting links.

    If a new field opens up, and you get there first, or early, then it's relatively easy to build a credible link graph. As a field expands, the next layer involves a lot of meta activity i.e. bloggers, media and other information curators writing about that activity. At the start of any niche, there aren't many players to talk about, so the early movers get all the links.

    As a field matures, you get a phenomenon Mike Grehan aptly characterised as "filthy linking rich":

    The law of "preferential attachment" as it is also known, wherein new links on the web are more likely to go to sites that already have many links, proves that the scheme is inherently biased against new and unknown pages. When search engines constantly return popular pages at the top of the pile, more web users discover those pages and more web users are likely to link to them

    Those who got all those links early on will receive more and more links over time because they are top of the results. They just need to keep doing what they're doing. It becomes very difficult for late entrants to beat them unless they do something extraordinary - which, by definition, probably means shifting the niche somewhere new.
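    The dynamic is easy to demonstrate with a toy preferential-attachment simulation - the numbers here are arbitrary, chosen only to show the shape of the effect:

    ```python
    # Toy simulation of preferential attachment ("the rich get richer").
    # Each new link picks its target with probability proportional to the
    # links that target already has. All numbers are arbitrary.
    import random

    random.seed(42)

    links = [1] * 10                 # ten early pages, one link each
    for _ in range(90):
        links.append(1)              # a later entrant arrives
        for _ in range(10):          # ten new links are handed out
            target = random.choices(range(len(links)), weights=links)[0]
            links[target] += 1

    print("links held by the first ten pages:", sum(links[:10]))
    print("links held by the last ten pages: ", sum(links[-10:]))
    ```

    In most runs the first ten pages end up with several times the links of the last ten, despite doing nothing different - the "filthy linking rich" effect in miniature.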

    If you're late to a crowded field, then you need to think in terms of differentiation. What can you offer that the rest do not? New content in such fields must be remarkable, i.e. worth remarking upon.

    Is that field moving in a new direction? If so, can you pivot in that direction and be first mover in that direction? Look not where a niche currently is, but where it's going, then position ahead of it.

    "Same old, same old content" doesn't get linked to, engaged with, ranked, or remarked upon - and why should it? The web is not short of content. The web has so much content that companies like Google have made billions trying to reduce it to a manageable set of ten links

    3. Brand

    Brand is the ultimate Google-protection tactic.

    It's not that brands don't get hammered by Google occasionally, because they do. But what tends to differ is the sentence handed down. The bigger the brand, the lighter the sentence, or the shorter the sentence, because no matter how much WalMart or The Office Of The President Of The United States Of America spams Google, Google must show such sites. I'm not suggesting these sites engage in aggressive SEO tactics, or need to, but we know they'll always be in Google.

    You don't have to be a big brand. You do need search volume on your distinctive brand name. If you're well known enough in your niche, i.e. you attract significant type-in search volume, then Google must show you or appear deficient.

    This is not to say having a brand means you can get away with poor behavior. But the more type-in traffic for your brand, the more pressure there is on Google to rank you.

    Links to a brand name will almost never look forced in the same way a link in a footer to "cheap online pharmacy" looks forced. People know your name; they link to you by name and talk about you by name - naturally.

    The more generic your site, the more vulnerable you are, as it's very difficult to own a space when you're aligning with generic keyword terms. The links are always going to look a little - or a lot - forced.

    This is not to say you shouldn't get links with keywords in them, but build a distinctive brand, too. The link graph will appear mostly natural - because it is. A few low quality links won't trump the good signals created by a lot of natural brand links.

    4. Engagement

    The web is a place.

    This place is filled with people. There are relationships between people, and relationships between people on the web are almost always expressed as links. It might be a Facebook link, a Twitter link, a comment link, or a blog link, but they're all links. It doesn't matter whether they're crawlable or not, or whether they're nofollowed; each still indicates a relationship.

    If Google is to survive, it must figure out these relationships.

    That's why all links - apart from negative SEO - are good links. The more signals of a real relationship, the better you *should* be ranked, because you are more relevant, in an objective sense.

    So look for ways to foster relationships and engagement. It might be guest posting. It might be commenting on someone else's site. It might be contributing to forums. It might be interviewing people. It might be accepting interviews. It might be forging relationships with the press. It might be forging relationships with business organisations. It might be contributing to charity. It might be running competitions. It might be attending conferences. It might be linking out to influential people.

    It's all networking.

    And wherever you network, you should be getting links as a byproduct.

    One potential problem: this kind of outreach, guest posting especially, can shade into thin article marketing if it's done purely for the links.

    Provide long - well, longer than 400 words - unique, editorial articles. Articles also need to get linked to, and engaged with. Articles need to be placed on sites where they'll be seen, as opposed to content farms.

    Ask yourself "am I providing genuine utility?"

    5. Fill A Need

    This is similar to differentiation, but a little more focused.

    Think about the problems people have in a niche. The really hard problems to solve. "How to", "tips", "advice", and so on.

    Solving a genuine problem for people tends to make them feel a sense of obligation, especially if they would otherwise have to pay for that help. If you can twist that obligation towards getting a link, all the better. For example: "if this article/video/whatever helped you, no payment necessary! But it would be great if you linked to us or followed us on Twitter", and so on. It doesn't need to be that overt, although sometimes overt is what is required. It fosters engagement. It builds your network. And it builds links.

    Think about ways you can integrate a call-to-action that results in a link of some kind.
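    One common pattern - just one illustration, not the only way - is a copy-and-paste "link to us" snippet placed under a tool or guide. A sketch that generates the embed HTML, with a placeholder URL and title:

    ```python
    # Sketch: build a copy-and-paste "link to us" call-to-action.
    # The URL and title are placeholders.
    def link_to_us_snippet(url: str, title: str) -> str:
        return (
            "<p>Found this useful? A link helps others find it too:</p>\n"
            '<textarea readonly rows="2" cols="60">'
            f'&lt;a href="{url}"&gt;{title}&lt;/a&gt;</textarea>'
        )

    print(link_to_us_snippet("https://example.com/guide", "The Example Guide"))
    ```

    The markup isn't the point; the point is that the ask for a link sits right where the utility was delivered.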

    Coda

    In other news, Caesars Palace bans Google :)
