Archive for the ‘Uncategorized’ Category


AudienceWise has been Acquired by SEOmoz

January 29th, 2013
by Tim Resnik


Matthew and I are very happy to announce that AudienceWise has been acquired by SEOmoz. Most of the details can be found in this blog post by Rand, the CEO of SEOmoz.

As many or all of you know, SEOmoz is the preeminent developer of software that helps marketers make decisions on how to optimize their web properties for search and social. What you may not know is that many "BIG" new features and products are coming. In fact, since Matthew and I are power users of the SEOmoz toolset, we will initially focus a lot of our time on providing feedback and input on these new initiatives.

As for AudienceWise, we won’t be taking on new clients, but will continue to consult for a limited number of our current clients on an ongoing basis. SEOmoz will not be jumping back into the consulting game and will continue to focus on building great software to help marketers attract customers via inbound channels: SEO and Social Media. If anyone wants more detail, please don’t hesitate to reach out.

Presentations in the Age of SlideShare

October 8th, 2012
by Matthew

One of the marketing platforms that has emerged in the last few years has been SlideShare. If you haven’t seen the site yet, it’s where you can post your slide decks for others to view, and follow a group of people so you know when they’ve posted new presentations. The site provides a way for people to see what you’ve presented if they weren’t at your speaking engagement, or as an easily linked-to repository for your slides after the presentation. To get an idea of the additional reach SlideShare can provide, consider a presentation I gave last month in Portland on Structured Data and Semantic SEO. This was at an SEMPDX event where I spoke to a crowd of about 75 people. On SlideShare, this presentation deck has 3,847 views. On a grander scale, this presentation on the Top Tools for Learning has over 55,000 views and was posted only five days before reaching that number.

The use case for audience members to have an easy place to access your slide decks is a clear benefit. However, I’m not sure how much people can get out of viewing the decks without actually being at the presentation. There’s almost universal agreement that the best presentations use slide decks as a visual sidekick to the real attraction: listening to the speaker. How useful can it be to view the slide decks without the critically important context of hearing the speaker? Can a deck serve dual purposes of being an entertaining presentation as well as coherent reference material on SlideShare?

Speakers in the search marketing world have come a long way

I think it’s a good sign that as presenters, we are even thinking about this. I certainly feel like I’ve come a long way since I started speaking in 2005. When I was getting acclimated to the SEO world, I felt like the two best speakers were Marshall Simmonds and Bill Hunt (Danny Sullivan doesn’t count. By this point he was mostly doing keynote style presentations that could afford to be more entertaining than informational. Also, he went to a terrible high school). Marshall and Bill were (and still are) able to present actionable search tactics in an engaging way, using Powerpoint as an enhancement rather than as the engine of their talk.

Fast forward to 2012, and Rand Fishkin and Wil Reynolds are two of our industry’s best speakers. Both have the ability to be passionate and entertaining in their talks, but also provide several higher-level strategic takeaways as well as tactical tips. It’s easier said than done. I tend to pack my presentations with info, but they can be very dry. If you’re not engaging the audience instead of stepping them through Powerpoint, even your best takeaways may be glossed over.

The bullet point era

That’s not to say our slide decks have necessarily evolved at the same rate.  I’m still seeing (and using) a lot of very dense screenshots and slides full of bullet points. Check out this beauty from one of my earliest decks in that era:

Oof. Thankfully, I didn't read the entire list of spiders. As I refined my decks, I moved on from lists of bullet points to more screenshots, but these also lacked obvious context. Like many others in our industry, I keenly watched the evolution of Rand Fishkin's Powerpoint strategy of striving for one thematic point per slide, with a contextual section at the bottom containing a snippet of text and any relevant links. Here's what this looks like in one of my current decks:

Rich Snippets Data

If you saw this slide on SlideShare, you'd get the basic gist of it without hearing me speak. When I present the slide, I have a few useful talking points around it that aren't immediately obvious, but people in the audience at my presentation shouldn't be focused on reading that slide instead of listening to what I'm saying about it. Above the callout at the bottom? That's a densely packed screenshot from hell. I don't consider it a problem on SlideShare, but I really don't want my live audience to squint and figure out what the hell is actually on that screenshot.

There are exceptions to this general trend in our industry. Check out Mike King’s deck from Mozcon this year. That’s a beautiful deck. It has a narrative, easily captured takeaways at the bottom of the deck, and it even works pretty well on SlideShare. I’d wager a guess that this deck came close to a hundred hours for Mike (and his team) to construct. It’s not trivial to build a deck that works live as a visually compelling aid to your talk, yet has value online after the presentation.

The new golden age of Haiku Deck

This brings me to Haiku Deck. In a nutshell, Haiku Deck gives you a wide array of well-designed templates to choose from. You write a headline and subhead, and then you can search for pictures available with the Creative Commons license that match keywords used in your text. Here’s an example slide I made for the SMX East show:

Haiku Deck Example

Haiku Deck’s visuals are high resolution, and you can usually find an image that resonates around the theme of your slide. People aren’t going to be trying to read through a mess. This is a huge improvement over the onslaught of Times New Roman bullet points that is the bane of every conference. There’s a catch though:

“The best way to design slides for SlideShare isn’t the same as the best way to create slides to actually use in a presentation.” - This whole interview with Joby Blume on the Haiku Deck blog is worth a read. His point is that if the slides are self-explanatory, they’re great for SlideShare but audience members can just read them and ignore the presenter. But if they consist of abstract visuals, you are in essence giving a speech, and the residual value of the deck on SlideShare is basically zero. Let’s hear from Joby again:

“We sometimes need our slides to help us get the point across, but we can’t do that if we put up a beautiful picture of a snow-capped mountain when we are talking about complex derivatives.”

Especially when thousands of folks are going to download the deck on SlideShare afterwards. That particular truth is why I'm struggling to do an entire deck with Haiku Deck without challenging screenshots or diagrams. I couldn't find an easy way within Haiku Deck to include even a small numbered list or pop-out, so it doesn't seem ideal for screenshots or pointing out additional resources. Since the better SEO presentations tend to revolve around tactics and tools, it's tough to convey actionable information with only a pretty picture and a couple of headlines. So while I do believe Haiku Deck is an improvement over the long national nightmare that is Powerpoint, I'm not sure how to use it for presentations that aren't in the "entertaining/inspirational speech" category. One solution is to use only one slide per 'Do/Don't' or tactic. That means if I was going over how to choose an appropriate structured data markup, I may use 20 slides to convey my talking points. If that's just one part of my talk, we could be looking at 100+ slides to get all the information across in a 20-minute presentation. Oy vey.

Is it worth it to make a version just for SlideShare?

Which brings me back around to SlideShare. I believe my best slide decks are geared towards audience members, but kind of lousy for folks on SlideShare who view them without any context. Upgrading my visuals with Haiku Deck may improve the transitional slides, but will probably make my instructional slides worse. They'll appear to be full of one or two basic instructions at best, and total cliches at worst. Given that maybe 10x-100x as many people will view your material online as will see it in person, it seems like a worthwhile effort to make a coherent online version.

Here’s what I think my answer is, to provide the best live presentations as well as useful recaps on SlideShare:

1. Make two decks. I know, even making one deck is a complete pain in the ass. The deck for the live presentation should be heavy on the visual-aid slides and contain fewer Powerpoint diagrams and less text. The SlideShare deck should provide slides that have screenshots with extended text and even actionable bullet points if necessary. On SlideShare, your bullet points aren't fighting with anyone for attention.

2. Better screenshots. Someone please come up with a way to provide enhanced screenshots with legible text and beautiful popouts/callouts. For now, I’ll still be using Snagit and annotating the screenshot at the bottom of the slide, but it’s still visually unappealing compared with what you can generate with a tool like Haiku Deck. Big screenshots full of text look terrible. Smaller screenshots with big ungainly pixelated text? That can be even worse.

3. Fewer speaking gigs. Putting 100+ hours into the deck and talk for each conference is not a small endeavor. I can't see myself doing this for a dozen conferences per year. Not to mention that you need to maintain this quality level for any client or in-house presentations. I'm thinking I'll have to be very picky about where I speak if I'm going to put in significantly more work on the deck(s).

The reality is that you never know which speaking engagement is the game changer for you, whether it opens a door to a new career path, a partnership opportunity, or any number of positive outcomes. Only now this could all happen via SlideShare or another online location for your presentation. None of us were satisfied with the bullet point era of last decade. Now we also need to avoid providing thousands of people online a deck of pretty pictures without any context.

I’d love to hear what other people are doing with SlideShare, or how they’re using Haiku Deck to create informational slides.

CampaignPop – Twitter Stats for the 2012 Presidential Election

September 4th, 2012
by Matthew

Earlier this year, we were building tools for our news clients so that they could have a dashboard for tracking Twitter activity on reporters and writers (we released free versions of some of these social media tools).

We were discussing different datasets that we thought would be interesting to track with our longtime colleague, Jay Leary. All of us follow politics pretty closely, so we decided to build out a social media dashboard for the 2012 presidential election. The result is CampaignPop, where we measure both the output and social media sentiment for each candidate:

CampaignPop Twitter Sentiment Scores

Jay and Tim did a very nice job getting the site up, and my best contribution was staying out of their way and making offhand suggestions. We’re still figuring out the best way to promote the site and use it to surface insights about the effects of social media participation on the campaign. With any luck, we’ll be able to build a predictive model for the results of the election that is useful to political pundits and major media sites.

Have a look and tell us what you think.

Twitter Sentiment Tool for Excel

July 26th, 2012
by Tim Resnik

I’m here at MozCon, the annual SEO conference hosted by SEOmoz. If you’re a sucker for data, APIs, and Excel, there isn’t a better speaker in our industry than Richard Baxter. This morning he did not disappoint as he took the audience through live Excel demos of tools he built using various APIs (MozScape, SharedCount, WIPMania).

At AudienceWise we have built several internal tools on the Twitter API, and recently have been using various APIs to analyze sentiment of tweets. Richard’s presentation this morning inspired me to hack together an Excel tool that performs sentiment analysis on tweets based on Twitter search queries.

It can be downloaded here and used for free.

It’s pretty straightforward. There are only two prerequisites for getting it to work:

1. Niels Bosma’s SEOTools for Excel. If you use his tools as much as we do, make sure to kick him a donation. He’s planning some neat looking upgrades.

2. A free API key from ViralHeat for the sentiment call. Once you have logged in, go here to grab your API key.
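For anyone curious what the spreadsheet's sentiment call looks like outside Excel, here's a rough Python sketch. The endpoint path and the `mood`/`prob` response fields are assumptions based on ViralHeat's documentation at the time, so verify them against the current docs; the sample response is canned, so no network call or real API key is needed to try it:

```python
import json
from urllib.parse import urlencode

# Assumed endpoint path -- check ViralHeat's API docs for the current one.
SENTIMENT_URL = "https://app.viralheat.com/social/api/sentiment"

def build_sentiment_request(text, api_key):
    """Build the GET URL that asks ViralHeat to score one piece of text."""
    return SENTIMENT_URL + "?" + urlencode({"text": text, "api_key": api_key})

def parse_sentiment(response_body):
    """Pull the mood label and its probability out of a JSON response."""
    data = json.loads(response_body)
    return data["mood"], data["prob"]

# Canned response stands in for the real API reply:
url = build_sentiment_request("I love this conference!", "MY_API_KEY")
mood, prob = parse_sentiment('{"mood": "positive", "prob": 0.87}')
```

In the workbook, SEOTools' fetch function plays the role of the HTTP call; the parsing step is what the sentiment columns are doing.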

sentiment screen shot

Catching Up With Structured Data at SMX Advanced

June 8th, 2012
by Matthew

Here’s my presentation from SMX Advanced 2012 on structured data use, tools, and how it fits into the SEO ecosystem. I also included a bit of background on how to get started with semantic technology:

SMX Advanced 2012 – Catching up with the Semantic Web

One follow-up item I've been asked about a lot since the presentation is what tools and solutions there are for setting up an RDF database, since that's the natural first step once you get a handle on writing SPARQL queries, as well as tools for converting data sources into linked data objects.

A good general list of updated semantic web development tools can be found at http://www.w3.org/2001/sw/wiki/Tools – it's also a good place to find RDF browsers.

URIBurner is a utility that takes URLs as data sources, runs them through its middleware, and then gives you output in a variety of data formats that conform to various standards (CSV, Turtle, RDF/XML, JSON). For example, here's the output for this blog post:

URIBurner output for this AudienceWise post

One of the newest RDF database options is Stardog. It's commercial grade, and I'm not sure of the fee structures yet, but it is designed for midscale use (re: speed) rather than NASA-level use (re: size). Given the people behind it, this might end up being the 'web scale' RDF database that provides the hooks most web publishers are used to working with.

4Store is an RDF platform available under the GNU license.

Mulgara is an open source RDF database written in Java.

Sesame is probably the most popular consumer-level framework for working with RDF data. It supports connections to remote databases via API, a full SPARQL query set, and compatibility with most RDF file formats (RDF/XML, Turtle, n-Triples, etc.).

Hopefully this will help you get started with rolling your own structured data.

Mining Twitter Data with Excel and other tips for Twitter Analytics

April 23rd, 2012
by Tim Resnik

We work with several publishing clients that manage a multitude of Twitter accounts. We frequently run across the problem of spending too much time compiling a large cross-section of key Twitter stats for multiple accounts. For example, we have a client with over 20 Twitter accounts for unique brands. Trying to collect and analyze follower and status update counts by hand is a tedious manual process, and not recommended.

There are two methods to accomplish data scraping from Twitter:

1) Develop a web application using the Twitter API.

2) Use a mashup of tools mixed with Excel or Google Docs.

Building a web application to perform this scraping function has obvious financial and time barriers, but may be worth the effort or licensing fee in the long run.

Using Google Docs and ImportXML can handle this function nicely, and it has been covered thoroughly (I mean thoroughly) by Distilled in their ImportXML guide. Also, John Doherty makes good use of the API and Google Docs and has created a link prospecting tool. What I want to cover in this blog post is how to collect basic Twitter following/follower stats across several accounts using Excel and Niels Bosma's badass plugin: SEO Tools for Excel. Since the SEO Tools plugin is doing all the heavy lifting in this example, Niels should really get all the credit.

Step 1: Install SEO Tools for Excel

There are a lot of other sweet features, but for this example we are just going to use the scraping function:

 SEO Tools for Excel

Step 2: Build a list of Twitter handles that you would like to collect via the Twitter API

First you need to figure out whose data you're interested in grabbing. This could be your competition, your employees, different products within your company, punk bands, or whatever group of Tweeters you want to analyze. You can use services like WeFollow and Listorious to generate ideas. Compile the list and put the handles in the first column of your spreadsheet (don't include a preceding @). With this particular XML feed, here are some of the relevant things you can pull:

  • ID: the numerical unique identifier for the account. This is a handy key to have because it will never change while the screen name can be changed by the owner of the account
  • Name: the name of the person who registered the account
  • Location: the user-entered location shown on the account's profile
  • Description: the user-created description for the account that shows up right under the screen name on Twitter.
  • Profile img URL: the location of the profile image for that account
  • Followers count: number of followers that this account has
  • Friends count: number of accounts that this account is following
  • Created at: the date and time that this account was created
  • Status count: number of tweets since the account was created
  • Listed count: the number of times that this account has been included in other accounts' lists
Or, check out all the gory details.

Here is what my column headers look like:

 Excel Column Headers

You can download the example XLS file here. The formulas are all pre-populated so all you need to do is enter the Twitter names (30 max due to API rate limiting) that you want the information for in the “Twitter Handle” column. If you want to build it out yourself, below are the details.

Step 3: Construct the API Request URL (sheet 2)

We must create the proper URL to be able to pull this information from Twitter. In this case we are using the Twitter REST API resource: GET users/show. This is one of a handful of API calls that does not require user authentication.

There are two required components and one optional to the URL.

Required:

  • Request URL: https://api.twitter.com/1/users/show.xml?screen_name=

+

  • Screen name: the Twitter handle you would like information for

So, it will end up looking like this:

https://api.twitter.com/1/users/show.xml?screen_name=mattcutts

Optional: include entities. This API request also returns the latest tweet for the requested user. If you would like to include information about the tweet such as user mentions, hashtags, or associated URLs, then append the following to the end of the URL: &include_entities=true. The final URL would look like this:

https://api.twitter.com/1/users/show.xml?screen_name=mattcutts&include_entities=true.

We now have to construct the URL. In sheet 2 of the spreadsheet you can see that I am simply concatenating two fields: the Twitter handle from sheet 1 and the URL from the current sheet.
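In the spreadsheet this is a simple CONCATENATE formula, but the same construction can be sketched in Python (the helper name is mine; note that this unauthenticated v1 XML endpoint has since been retired, so the URL itself is historical):

```python
# Recreate sheet 2's concatenation: base URL + Twitter handle (+ optional entities flag).
BASE_URL = "https://api.twitter.com/1/users/show.xml?screen_name="

def build_request_url(handle, include_entities=False):
    """Return the users/show request URL for one Twitter handle (no preceding @)."""
    url = BASE_URL + handle
    if include_entities:
        url += "&include_entities=true"
    return url

# One URL per handle in the "Twitter Handle" column:
urls = [build_request_url(h) for h in ("mattcutts", "randfish")]
```

Each cell in sheet 2 is doing exactly this join for the handle on its row.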

Step 4: Write Your Formulas and Populate the Array

Nothing too fancy here, thanks to SEO Tools for Excel. There is a pre-built function that does all the heavy lifting. You can use either the XPath (reads XML) or JSON option with the Twitter API. In this example we will use XML.

The XPath function has two inputs: the URL that calls the XML file, which we constructed above in sheet 2, and the instructions for selecting the right information in the file (the proper node). Here is a very basic write-up of how to use XPath to select nodes within an XML file.
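As a rough illustration of what that XPath selection is doing under the hood, here's the same node lookup in Python using the standard library's ElementTree. The sample XML is a trimmed sketch of the users/show response; the field names match the list in Step 2, but the values are invented:

```python
import xml.etree.ElementTree as ET

# Trimmed sketch of the XML that users/show returned in 2012.
SAMPLE = """\
<user>
  <id>12345</id>
  <name>Matt Cutts</name>
  <followers_count>250000</followers_count>
  <friends_count>400</friends_count>
  <statuses_count>12000</statuses_count>
</user>
"""

def xpath_text(xml_text, path):
    """Rough stand-in for the plugin's XPath call: return one node's text."""
    return ET.fromstring(xml_text).findtext(path)

followers = int(xpath_text(SAMPLE, "followers_count"))
statuses = int(xpath_text(SAMPLE, "statuses_count"))
```

In Excel, the formula in each column is passing a node path like these to pick one value out of the fetched XML.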

 

Once you have your formula written for each column, fill in the rest of your array and wait a bit as it fills in the data. The Twitter REST API is limited to 150 requests per hour and each cell in the array you have just created is an API call. In this example we have 5 columns and 25 Twitter handles so filling out this array once will be just 25 short of the hourly cap.

Step 5: Some Basic Analysis

Now that we have an array of data for a group of Twitter users we can do some very basic analysis. In follow up blogs, I will collect the data over time and do a little deeper look at trending analysis.

In sheet 3 you will find a few graphs based on this data. It is important to note that the data in sheet 1 is formatted as text because Excel is keying off of the formula. This makes analysis difficult on that tab itself. I copied and pasted the values to sheet 3 (Copy > Paste Special > Values) and then converted the text to numbers.

I wanted to compare the number of followers juxtaposed with the average number of tweets per day. Twitter returns the date as (Day of Week, Month, Day, Time Stamp, Year). In order to format the date so Excel can understand it, you must pull the right information out of the 'Date Created' field:

Format Dates in Excel

Excel still can’t understand it because it is reading the formula instead of what is visually shown in the cell. To get it to work, use Copy > Paste Special > Values to overwrite the formula, then go to Format Cells > Date and format as 04/23/2012. Create a new column called Avg. Tweets per Day. You now need to figure out how many days have passed since the account was created and divide the total number of tweets by that.

Average Tweets per Day
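Outside Excel, the same date math can be sketched in Python; `strptime` handles Twitter's created-at format directly (the helper name and sample numbers are mine):

```python
from datetime import datetime, timezone

def avg_tweets_per_day(created_at, statuses_count, as_of=None):
    """Total tweets divided by days elapsed since the account was created."""
    created = datetime.strptime(created_at, "%a %b %d %H:%M:%S %z %Y")
    as_of = as_of or datetime.now(timezone.utc)
    days = (as_of - created).days
    return statuses_count / days

# e.g. an account created Jan 1, 2007 with 12,000 tweets, measured on Apr 23, 2012:
rate = avg_tweets_per_day(
    "Mon Jan 01 00:00:00 +0000 2007",
    12000,
    as_of=datetime(2012, 4, 23, tzinfo=timezone.utc),
)
```

The Excel version does the same thing once the text date has been converted: subtract the created date from today, then divide the status count by the result.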

Next, you can select the data in the Followers and Avg. Tweets per Day columns to generate your graph. I won’t go over the details on chart formatting here, but here is the end result:

followers to tweets graph

Follow up post: Twitter Data Scraping and Insights Using Excel

Obama Tries for Geek Cred with ASCII Art

March 9th, 2012
by Tim Resnik

It’s not exactly like he came up with the idea, but at least he didn’t stop it.

If you view the source of barackobama.com, you will see the following piece of ASCII art commented at the top of the HTML doc.

Obama ASCII art

The technological difference between Obama's and Mitt Romney's websites is stark. Some may even say symbolic, but that's not for this type of blog. Barry's web team is using HTML5 and has enough time (and money) on hand to make ASCII art. If you jump over to Mitt Romney's site and see what is going on, you'll notice that it is coded up in a more antiquated version of HTML (XHTML 1.0) – not quite cutting edge. It also uses the free, open-source content management system Drupal (great product, by the way), which if nothing else is financially in line with fiscal-conservative values — perhaps Newt should have taken a page out of this book.

On a separate but somewhat-related note: in the 2008 election we saw the Barack Obama campaign leverage social media very effectively to build a grass-roots campaign that raised a record amount of money. I predict that in this election we will see the Obama campaign innovate using geo-social/mobile applications. After all, he did join Foursquare a few months back.

Interview with Matthew Brown by PPC Associates

November 28th, 2011
by Tim Resnik

PPC Associates posted an interview with AudienceWise co-founder Matthew Brown about the current state of SEO and inbound marketing. 

Questions include:

1) Please tell me your background and what you do for a living.

2) Please complete the phrase “PPC to SEO as _____ is to ______.” (And please explain your answer.)

3) If businesses are raking in money via paid search, why should they care about SEO?

4) Many objections to SEO revolve around the indefinite, unpredictable nature of the results (which contrasts to the highly precise ROI from PPC). How would you answer that?

5) How can an SEO client determine whether a prospective SEO provider is knowledgeable and capable of achieving excellent results for them?

6) What is a typical SEO engagement for you?

The Google Plus Killer Feature – Search (or at least it could be)

September 20th, 2011
by Tim Resnik

How do you find people, businesses, or topics on social networks? The logical thing to do is use the built-in search tools for the site you're on. If only it were so easy. Lately I find myself going to Google and using a "site:" operator to search Twitter, LinkedIn, and, probably the worst sinner of all, Facebook. On the other hand, and not surprisingly, Google has used its bread-and-butter skills of indexation and display to make the Google+ search experience far more robust than its social competitors'.

Let’s look at a really simple example. A lot of people like Coca-Cola. In fact, it is one of the most recognized and valuable brands in the world. Based on that, we'll assume that someone searching for the common alias "coke" in a search box is generally looking for the parent brand Coca-Cola (unless it happens to be someone who really enjoyed the 80s; people usually infer that if they don't refine their search, they'll be returned the brand, not the illicit substance). If you perform a basic search for "coke" on Facebook, you get listings categorized by Pages, Groups, Apps, and People. The top Page listing is an exact title match for a page with nearly a million 'likes', but the Coca-Cola page is nowhere to be found.

Facebook Search

Should Coca-Cola, which doesn’t have an exact match in the Page title, come up before Coke Studio? Google sure thinks so, and I am willing to bet that 98+% of the people searching for “coke” on Facebook are looking for the official company page. The conclusion is not surprising: Google’s search algorithm appears far more sophisticated than Facebook’s. After all, it is what the empire is built on.

Facebook Google Search

The adoption of Google+ has been significant, with over 25 million users accrued in the last few months (in invite-only mode, which was lifted today). However, usage and sharing appear to have dropped off a cliff (I have no explicit evidence of this; it's a common sentiment among folks in online marketing circles). For Google+ to be a formidable competitor to Facebook, it needs to leverage what Google is really good at: discovery.

The screenshot below shows the exact same search that I did in Facebook. The test is far from scientific and isn't even comparing apples to apples, since G+ doesn't allow business profiles. However, two things are clear. First, the results more closely match my intent, and I assume that once G+ allows business pages, the Coca-Cola business page would take the place of the trucker hat chick. Second, the results page itself has superior organization, providing blended results by default and separate filters for people, posts, and Sparks.

Google Plus Search

As Google+ grows and continues to innovate, Facebook will surely be forced to “innovate” here (and vice versa, of course). See: Facebook announcing asynchronous relationships. For G+ to finally throw its full weight into the social media arena, it must leverage what its empire is built on as the killer competitive advantage.

Google+ Shares Showing Up in Search Results Are Mislabeled

July 13th, 2011
by Tim Resnik

I was doing a quick search for keyword tools on Google today and noticed that +1's from people in my Google+ network were showing up under the search results. I also found that Google has mislabeled Google+ as Google +1 in the results. Branding hiccup or bug?
