Publishers prefer pitches that demonstrate accuracy and authority
Kristin Tynski is the co-founder and SVP of Creative at Fractl, a growth marketing agency that has helped Fortune 500 and boutique companies earn quality media coverage, backlinks, brand recognition, and authority.
I recently saw people mention how difficult it is to create content that earns massive attention and links. They suggest it may be better to focus on content without that potential — content that earns fewer links, but more consistently and at higher volumes.
In some cases, this can be good advice. However, I want to argue that it is very possible to create content that consistently generates large numbers of high-authority links. In practice, I've found there is a truly scalable way to earn high-authority links, and it rests on two tactics working together:
- Create newsworthy content of interest to large online publishers (newspapers, large blogs, or large niche publishers).
- Pitch publishers in a way that breaks through the noise of their inboxes so they actually see your content.
How can you use these techniques to achieve consistent, predictable returns from content marketing?
The key is data.
Techniques for generating press with data-driven stories
I firmly believe there is no shortcut to earning press coverage, and that only genuinely new, timely, and interesting content can succeed. Without a doubt, the easiest way to do this predictably is to take a data journalism approach.
One of the best ways to create data-centric content is to use existing data sets to tell a story.
There are tens of thousands — maybe hundreds of thousands — of existing public data sets that anyone can use to tell new and impactful data-centric stories, which can easily lead to massive press coverage and high-authority links.
Over the past five years, governments, NGOs, and public corporations have undertaken major transparency initiatives to make their data more available and accessible.
Additionally, FOIA requests have become commonplace, so even more data is being shared and made public for journalistic research and storytelling.
Since this data usually comes from the government or some other authoritative source, it is often easier to pitch these stories to publishers, because you don't face the same hurdles in proving accuracy and authority.
Possible roadblocks
That said, the accessibility of government-provided data varies widely. There are few or no data standards, and each federal and local agency has different resources for making its data easy for third parties to use.
As a result, each data set often has its own quirks and complexities. Some are straightforward and available as clean, well-documented CSVs or other standard formats.
Unfortunately, others are difficult to decode, clean, validate, or even download. Sometimes they're trapped in hard-to-parse PDFs, fragmented reports, or outdated query tools that spit out awkward tables.
Accurately collecting and using many of these data sets often requires a deeper understanding of web scraping and of programmatically cleaning and reformatting data.
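To make that concrete, here is a minimal sketch of the kind of programmatic cleanup a messy agency export often needs. The column names and sample rows below are invented for illustration; the snippet uses pandas to normalize inconsistent headers, strip stray whitespace, coerce a numeric column, and drop rows that can't be parsed:

```python
import io

import pandas as pd

# Invented sample of a "messy" agency CSV export: inconsistent headers,
# stray whitespace, and a non-numeric placeholder in a numeric column.
raw_csv = """County Name , Total Incidents ,Year
Adams , 1204 ,2021
 Boone,N/A,2021
Clark , 987 ,2021
"""

df = pd.read_csv(io.StringIO(raw_csv))

# Normalize headers: strip whitespace, lowercase, convert to snake_case.
df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]

# Strip surrounding whitespace from the text column.
df["county_name"] = df["county_name"].str.strip()

# Coerce the numeric column; unparseable placeholders become NaN.
df["total_incidents"] = pd.to_numeric(df["total_incidents"], errors="coerce")

# Drop rows whose numeric value couldn't be recovered.
df = df.dropna(subset=["total_incidents"]).reset_index(drop=True)

print(df)
```

Real data sets will each need their own version of these steps, but the pattern — normalize, coerce, validate, drop or flag what fails — is the same regardless of the source.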