
Link Building

We’ve been link building for over two decades now and have developed a link integration philosophy aimed at effectiveness and safety for our clients.

Here’s what makes us special

Naturally integrated, undetectable links.
The advantage of our approach to links lies in our strict link integration guidelines. We aim for low-to-no detectability. This means that if someone were to look at a page that links to you, they wouldn’t be able to tell who wanted the link there, and even if they could, the link would be perfectly justified by its purpose and context.

(and here’s what doesn’t)

We don’t subscribe to the quantity-and-authority method.
If you’re after an exact number of links each month, all required to meet some arbitrary metric from your favourite tool, then we’re probably not a good match. If you’re willing to align with our philosophy of natural link integration and undetectable links above all else, then we should talk.

Deep Understanding of Links


Our early research into links on the web, conducted between 2011 and 2016, showed that the majority of link builders and SEOs simply don’t know how to integrate links well. A trained eye can always spot “the one who wanted the link”, and most efforts to make links “look natural” are fruitless.

A 2023 review of the state of links on the web found that absolutely nothing had changed and that, in fact, some of the industry’s less impactful practices had also become commonplace in the blogosphere.

Even when a good link is made, it’s generally devalued by less-than-ideal integration.

Cutting Edge Research

At SMX Munich in 2023, we presented our statistical model of links, built on the most common keywords found in natural links across the web, and shared the tool with the audience.
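
For illustration only, here is a minimal sketch of the kind of keyword statistics such a model could be built on: counting the most common words across a corpus of natural anchor texts. The corpus and tokenisation are assumptions of ours; the actual model and tool from the talk are not described here.

```python
# Illustrative only: frequency statistics over a stand-in corpus of
# natural anchor texts. The real model's corpus and method may differ.
from collections import Counter
import re

anchor_texts = [  # hypothetical examples of natural anchor text
    "read more",
    "official documentation",
    "this study",
    "read the full report",
]

counts = Counter(
    word
    for anchor in anchor_texts
    for word in re.findall(r"[a-z']+", anchor.lower())
)
print(counts.most_common(5))  # e.g. [('read', 2), ('more', 1), ...]
```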

Building on this work later that year, we began gathering data and planning the development of a machine learning model that understands links on the web intuitively and generalises well enough to be deployed across a wide variety of link-related tasks and workflows.

In early 2024, we successfully trained a general-purpose transformer model and announced the public release of its smaller variant, LinkBERT, on HuggingFace.

LinkBERT


LinkBERT is a fine-tuned version of BERT (large, cased) trained on a 4.5GB dataset of top-quality content and links from the web. The transformer is designed for binary token classification: when presented with plain text, it predicts which parts of that text are most likely to be links or anchor text.
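
Below is a minimal sketch of how such a model can be queried with the Hugging Face transformers library. The repo ID dejanseo/LinkBERT and the meaning of label 1 (“token belongs to a link/anchor span”) are assumptions for illustration; check the model card on HuggingFace for the actual ID and label mapping.

```python
# Minimal sketch: binary token classification with a LinkBERT-style model.
# The repo ID below is an assumption; substitute the actual HuggingFace ID.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

MODEL_ID = "dejanseo/LinkBERT"  # assumed repo ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForTokenClassification.from_pretrained(MODEL_ID)
model.eval()

text = "Our guide to internal linking covers anchor text selection in depth."
inputs = tokenizer(text, return_tensors="pt", return_offsets_mapping=True)
offsets = inputs.pop("offset_mapping")[0]  # model forward doesn't take this

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, seq_len, 2) for binary labels

# Label 1 is assumed to mean "this token belongs to a link/anchor span".
preds = logits.argmax(dim=-1)[0]

# Map predicted link tokens back to character spans in the original text.
for (start, end), label in zip(offsets.tolist(), preds.tolist()):
    if label == 1 and end > start:  # special tokens carry (0, 0) offsets
        print(f"link token: {text[start:end]!r} [{start}:{end}]")
```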

Applications of LinkBERT

LinkBERT’s applications are vast and diverse, tailored to enhance both the efficiency and quality of web content creation and analysis:

  • Anchor Text Suggestion: Acts as a mechanism during internal link optimization, suggesting potential anchor texts to web authors (see the sketch after this list).
  • Evaluation of Existing Links: Assesses the naturalness of link placements within existing content, aiding in the refinement of web pages.
  • Link Placement Guide: Offers guidance to link builders by suggesting optimal placement for links within content.
  • Anchor Text Idea Generator: Provides creative anchor text suggestions to enrich content and improve SEO strategies.
  • Spam and Inorganic SEO Detection: Helps identify unnatural link patterns, contributing to the detection of spam and inorganic SEO tactics.
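
Building on the model sketch above, here is one illustrative way to turn per-token link predictions into contiguous anchor-text suggestions. The span-merging heuristic and example data are our own illustration, not a published part of the LinkBERT workflow.

```python
# Illustrative post-processing: group adjacent tokens labelled 1 into
# candidate anchor-text spans. Inputs mirror the offsets/labels produced
# in the earlier sketch.
from typing import List, Tuple

def merge_link_spans(text: str,
                     offsets: List[Tuple[int, int]],
                     labels: List[int]) -> List[str]:
    """Merge adjacent link-labelled tokens into candidate anchor texts."""
    spans, current = [], None
    for (start, end), label in zip(offsets, labels):
        if end <= start:  # special tokens carry (0, 0) offsets
            continue
        if label == 1:
            if current and start <= current[1] + 1:
                current = (current[0], end)  # extend the open span
            else:
                if current:
                    spans.append(current)
                current = (start, end)  # open a new span
        elif current:
            spans.append(current)
            current = None
    if current:
        spans.append(current)
    return [text[s:e] for s, e in spans]

# Example with hand-made offsets and labels for illustration:
text = "Read our guide to anchor text optimization for details."
offsets = [(0, 4), (5, 8), (9, 14), (15, 17), (18, 24), (25, 29),
           (30, 42), (43, 46), (47, 54), (54, 55)]
labels = [0, 0, 0, 0, 1, 1, 1, 0, 0, 0]
print(merge_link_spans(text, offsets, labels))
# -> ['anchor text optimization']
```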

A public demo of the model is available at: https://dejan.ai/linkbert/