One of the marketing platforms to emerge in the last few years is SlideShare. If you haven’t seen the site yet, it’s where you can post your slide decks for others to view, and follow a group of people so you know when they’ve posted new presentations. The site gives people a way to see what you’ve presented if they weren’t at your speaking engagement, and serves as an easily linked-to repository for your slides after the presentation. To get an idea of the additional reach SlideShare can provide, consider a presentation I gave last month in Portland on Structured Data and Semantic SEO. This was at an SEMPDX event where I spoke to a crowd of about 75 people. On SlideShare, this presentation deck has 3,847 views. On a grander scale, this presentation on the Top Tools for Learning passed 55,000 views within five days of being posted.
Giving audience members an easy place to access your slide decks is a clear benefit. However, I’m not sure how much people can get out of viewing the decks without actually being at the presentation. There’s almost universal agreement that the best presentations use slide decks as a visual sidekick to the real attraction: listening to the speaker. How useful can it be to view the slide decks without the critically important context of hearing the speaker? Can a deck serve the dual purposes of being an entertaining presentation and coherent reference material on SlideShare?
Speakers in the search marketing world have come a long way
I think it’s a good sign that as presenters, we are even thinking about this. I certainly feel like I’ve come a long way since I started speaking in 2005. When I was getting acclimated to the SEO world, I felt like the two best speakers were Marshall Simmonds and Bill Hunt (Danny Sullivan doesn’t count. By this point he was mostly doing keynote style presentations that could afford to be more entertaining than informational. Also, he went to a terrible high school). Marshall and Bill were (and still are) able to present actionable search tactics in an engaging way, using Powerpoint as an enhancement rather than as the engine of their talk.
Fast forward to 2012, and Rand Fishkin and Wil Reynolds are two of our industry’s best speakers. Both have the ability to be passionate and entertaining in their talks while also providing higher-level strategic takeaways as well as tactical tips. It’s easier said than done. I tend to pack my presentations with info, but they can be very dry. If you’re stepping the audience through Powerpoint instead of engaging them, even your best takeaways may be glossed over.
The bullet point era
That’s not to say our slide decks have necessarily evolved at the same rate. I’m still seeing (and using) a lot of very dense screenshots and slides full of bullet points. Check out this beauty from one of my earliest decks in that era:
Oof. Thankfully, I didn’t read the entire list of spiders aloud. As I refined my decks, I moved from lists of bullet points to more screenshots, but these also lacked obvious context. Like many others in our industry, I keenly watched the evolution of Rand Fishkin’s Powerpoint strategy of striving for one thematic point per slide, with a contextual section at the bottom containing a snippet of text and any relevant links. Here’s what this looks like in one of my current decks:
If you saw this slide on SlideShare, you’d get the basic gist of it without hearing me speak. When I present the slide, I have a few useful talking points around it that aren’t immediately obvious, but people in the audience at my presentation shouldn’t be focused on reading that slide instead of listening to what I’m saying about it. Above the callout at the bottom? That’s a densely packed screenshot from hell. I don’t consider it a problem on SlideShare, but I really don’t want my live audience squinting to figure out what the hell is actually on that screenshot.
There are exceptions to this general trend in our industry. Check out Mike King’s deck from Mozcon this year. That’s a beautiful deck. It has a narrative, easily captured takeaways at the bottom of the deck, and it even works pretty well on SlideShare. I’d wager that this deck took Mike (and his team) close to a hundred hours to construct. It’s not trivial to build a deck that works live as a visually compelling aid to your talk, yet has value online after the presentation.
The new golden age of Haiku Deck
This brings me to Haiku Deck. In a nutshell, Haiku Deck gives you a wide array of well-designed templates to choose from. You write a headline and subhead, and then you can search for pictures available with the Creative Commons license that match keywords used in your text. Here’s an example slide I made for the SMX East show:
Haiku Deck’s visuals are high resolution, and you can usually find an image that resonates around the theme of your slide. People aren’t going to be trying to read through a mess. This is a huge improvement over the onslaught of Times New Roman bullet points that is the bane of every conference. There’s a catch though:
“The best way to design slides for SlideShare isn’t the same as the best way to create slides to actually use in a presentation.” - This whole interview with Joby Blume on the Haiku Deck blog is worth a read. His point is that if the slides are self-explanatory, they’re great for SlideShare but audience members can just read them and ignore the presenter. But if they consist of abstract visuals, you are in essence giving a speech, and the residual value of the deck on SlideShare is basically zero. Let’s hear from Joby again:
“We sometimes need our slides to help us get the point across, but we can’t do that if we put up a beautiful picture of a snow-capped mountain when we are talking about complex derivatives.”
Especially when thousands of folks are going to download the deck on SlideShare afterwards. That truth is why I’m struggling to build an entire deck with Haiku Deck without falling back on challenging screenshots or diagrams. I couldn’t find an easy way within Haiku Deck to include even a small numbered list or pop-out, so it doesn’t seem ideal for screenshots or for pointing out additional resources. Since the better SEO presentations tend to revolve around tactics and tools, it’s tough to convey actionable information with only a pretty picture and a couple of headlines. So while I do believe Haiku Deck is an improvement over the long national nightmare that is Powerpoint, I’m not sure how to use it for presentations that aren’t in the “entertaining/inspirational speech” category. One solution is to use only one slide per ‘Do/Don’t’ or tactic. That means if I were going over how to choose an appropriate structured data markup, I might use 20 slides to convey my talking points. If that’s just one part of my talk, we could be looking at 100+ slides to get all the information across in a 20-minute presentation. Oy vey.
Is it worth it to make a version just for SlideShare?
Which brings me back around to SlideShare. I believe my best slide decks are geared toward audience members, but kind of lousy for folks on SlideShare who view them without any context. Upgrading my visuals with Haiku Deck may improve the transitional slides, but will probably make my instructional slides worse. They’ll appear to be full of one or two basic instructions at best, and total clichés at worst. Given that maybe 10x-100x as many people will view your material online as see it in person, it seems like a worthwhile effort to make a coherent online version.
Here’s what I think my answer is, to provide the best live presentations as well as useful recaps on SlideShare:
1. Make two decks. I know, even making one deck is a complete pain in the ass. The deck for the live presentation should be heavy on the visual aid slides and contain less Powerpoint diagrams and text. The SlideShare deck should provide slides that have screenshots with extended text and even actionable bullet points if necessary. On SlideShare, your bullet points aren’t fighting with anyone for attention.
2. Better screenshots. Someone please come up with a way to provide enhanced screenshots with legible text and beautiful popouts/callouts. For now, I’ll still be using Snagit and annotating the screenshot at the bottom of the slide, but it’s still visually unappealing compared with what you can generate with a tool like Haiku Deck. Big screenshots full of text look terrible. Smaller screenshots with big ungainly pixelated text? That can be even worse.
3. Fewer speaking gigs. Putting 100+ hours into the deck and talk for each conference is not a small endeavor. I can’t see myself doing this for a dozen conferences per year. Not to mention that you need to maintain this quality level for any client or in-house presentations. I’m thinking I’ll have to be very picky about where I speak if I’m going to put in significantly more work on the deck(s).
The reality is that you never know which speaking engagement is the game changer for you, whether it opens a door for a new career path, partnership opportunity, or any number of positive outcomes. Only now this could all happen via SlideShare or another online location for your presentation. None of us were satisfied with the bullet point era of last decade. Now we also need to avoid providing thousands of people online a deck of pretty pictures without any context.
I’d love to hear what other people are doing with SlideShare, or how they’re using Haiku Deck to create informational slides.
Earlier this year, we were building tools for our news clients so that they could have a dashboard for tracking Twitter activity on reporters and writers (we released free versions of some of these social media tools).
With our longtime colleague Jay Leary, we were discussing different datasets that we thought would be interesting to track. All of us follow politics pretty closely, so we decided to build out a social media dashboard for the 2012 presidential election. The result is CampaignPop, where we measure both the output and social media sentiment for each candidate:
Jay and Tim did a very nice job getting the site up, and my best contribution was staying out of their way and making offhand suggestions. We’re still figuring out the best way to promote the site and use it to surface insights about the effects of social media participation on the campaign. With any luck, we’ll be able to build a predictive model for the results of the election that is useful to political pundits and major media sites.
Have a look and tell us what you think.
I’m here at MozCon, the annual SEO conference hosted by SEOmoz. If you’re a sucker for data, APIs and Excel, there isn’t a better speaker in our industry than Richard Baxter. This morning he did not disappoint as he took the audience through live Excel demos of tools he built using various APIs (MozScape, SharedCount, WIPMania).
At AudienceWise we have built several internal tools on the Twitter API, and recently have been using various APIs to analyze sentiment of tweets. Richard’s presentation this morning inspired me to hack together an Excel tool that performs sentiment analysis on tweets based on Twitter search queries.
It’s pretty straightforward. There are only two prerequisites for getting it to work:
1. Niels Bosma’s SEOTools for Excel. If you use his tools as much as we do, make sure to kick him a donation. He’s planning some neat looking upgrades.
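Since the workbook itself isn’t embedded here, a rough sketch of the core idea in plain Python may help. It scores each tweet against small positive/negative word lists; the word lists, tweets, and function name below are invented for illustration and are not part of the actual Excel tool, which relies on external APIs for its sentiment scores.

```python
# Minimal lexicon-based sentiment scoring, as an illustration only.
# The word lists here are tiny stand-ins for a real sentiment lexicon.
POSITIVE = {"great", "love", "awesome", "useful", "win"}
NEGATIVE = {"terrible", "hate", "broken", "spam", "fail"}

def sentiment_score(text):
    """Return (# positive words) - (# negative words) for one tweet."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return len(words & POSITIVE) - len(words & NEGATIVE)

# Hypothetical tweets standing in for Twitter search results:
tweets = [
    "Love this awesome Excel tool, great work!",
    "Another broken dashboard, I hate spam reports.",
]
scores = [sentiment_score(t) for t in tweets]
print(scores)  # first tweet scores positive, second negative
```

A real version would pull tweets from the Twitter search API and use a proper sentiment service or lexicon, but the aggregation logic is the same.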
Here’s my presentation from SMX Advanced 2012 on structured data use, tools, and how it fits into the SEO ecosystem. I also included a bit of background on how to get started with semantic technology:
A good general list of updated semantic web development tools can be found at http://www.w3.org/2001/sw/wiki/Tools - it's also a good place to find RDF browsers.
URIBurner is a utility that takes URLs as data sources, runs them through its middleware, and then gives you output in a variety of data forms that conform to various standards (CSV, Turtle, RDF/XML, JSON). For example, here’s the output for this blog post:
One of the newest RDF database options is Stardog. It’s commercial grade, and I’m not sure of the fee structures yet, but it is designed for midscale use (re: speed) rather than NASA-level use (re: size). Given the people behind it, this might end up being the ‘web scale’ RDF database that provides the hooks most web publishers are used to working with.
4Store is an RDF platform operating under the GNU license.
Mulgara is an open source RDF database written in Java.
Sesame is probably the most popular consumer-level framework for working with RDF data. It supports connections to remote databases via API, the full SPARQL query set, and compatibility with most RDF file formats (RDF/XML, Turtle, N-Triples, etc).
Hopefully this will help you get started with rolling your own structured data.
This is a follow up to a post I did a few weeks ago about using Excel to scrape key user data from Twitter. In that post I set up a spreadsheet that was tracking 20 of the most followed people in the SEO industry. After collecting that data on a daily basis over the last few weeks, I am going to use this post to demonstrate some basic analysis that can be done. It is not meant to be an Excel tutorial. For one of those, I recommend you check out Distilled’s guide on Excel for SEO, and of course Richard Baxter has a wealth of Excel tips and tutorials on his blog.
Step 1: Format Your Data
Here’s the spreadsheet that goes along with this post - Twitter Scrape Analysis – Excel
I’ve been collecting Twitter data on these 20 SEOs for about 30 days. That’s about 600 lines of data in my Excel spreadsheet. Pretty modest from a data analysis perspective, but we still need to make sure that things are formatted cleanly. There are really only two things we’ll be doing with the data: 1) adding a few columns for daily trending, i.e. number of followers gained or lost per day, and 2) creating a pivot table. In order to add the trending we are going to do some data sorting, so I suggest that you put your data into a table (select all the data and hit CTRL + L).
Step 2: Basic Analysis Inline with your Data
The spreadsheet of our data is attached, but for quick review, our columns look like this:
There are two columns I added next for trending: one for calculating the daily followers gained or lost and a column to count the number of Tweets broadcast. You could do a similar column for following and listed. Next, I perform a quick and dirty formula to calculate the daily trend. There is probably a more elegant solution, but I found this to work, so I went with it:
- Sort, by handle and then by date (oldest to newest)
- Formula: IF(Table2[[#This Row],[handle]]=A1,Table2[[#This Row],[followers]]-C1,"start"). This formula checks whether the current row and the row above belong to the same handle, then does basic subtraction, giving you the difference between the row you are looking at and the day before.
- Clean-up: this is an important step. Copy the entire table and paste values in order to overwrite the formulas with the values. If you don’t, the numbers will change when sorting. Next, sort the follower gain/loss column and delete all the fields that say “start”. We want to make sure we only have numbers in this column so the pivot tables and graphs translate properly when we are building our visualizations.
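For readers who prefer scripting, the sort-then-IF steps above can be sketched in Python with pandas: group by handle, sort by date, and take day-over-day differences. The column names and sample numbers below are assumptions chosen to mirror the spreadsheet, not the actual dataset.

```python
# Day-over-day follower change per handle, the pandas way.
import pandas as pd

df = pd.DataFrame({
    "handle":    ["a", "a", "a", "b", "b"],
    "date":      pd.to_datetime(["2012-07-01", "2012-07-02", "2012-07-03",
                                 "2012-07-01", "2012-07-02"]),
    "followers": [100, 105, 103, 50, 60],
})

df = df.sort_values(["handle", "date"])
# diff() within each handle plays the role of the IF formula;
# the first row per handle comes back as NaN instead of "start".
df["follower_gain"] = df.groupby("handle")["followers"].diff()
print(df["follower_gain"].tolist())  # nan, 5.0, -2.0, nan, 10.0
```

Dropping the NaN rows mirrors the manual clean-up step of deleting the “start” cells.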
Step 3: Visualize
Let’s make the data more meaningful and present that data in a simple way that could bubble up some insights. There are a hundred different tools and visualization methods that you could choose, but I am going to keep it very simple and use a pivot table for this example. The main benefit of a pivot table is to summarize like data points, so in this case we want to see basic trending information by each twitter handle that we are tracking.
In the data tab select the whole table, and under the “Insert” ribbon tab click on “PivotTable”. A new pivot table tab will be generated with the field selector open. In the field selector, drag handle into the Row Labels area, then add followers, follower gain/loss, and daily tweets to the Values area. Make sure to open “Value Field Settings” and select “Average” for each. Excel defaults to SUM, which would give you the sum of all the entries in the data table for each handle.
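The same summary can be sketched with pandas, where pivot_table makes the SUM-versus-mean choice explicit. As before, the column names and numbers are illustrative stand-ins for the real dataset.

```python
# Average followers, gain/loss, and tweets per handle via pivot_table.
import pandas as pd

df = pd.DataFrame({
    "handle":        ["a", "a", "b", "b"],
    "followers":     [100, 104, 50, 52],
    "follower_gain": [2, 4, 1, 1],
    "daily_tweets":  [5, 3, 0, 2],
})

pivot = pd.pivot_table(
    df,
    index="handle",
    values=["followers", "follower_gain", "daily_tweets"],
    aggfunc="mean",   # like Excel, the default aggregation must be overridden
)
print(pivot.loc["a", "followers"])  # 102.0
```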
To make it more readable, sort by followers by clicking “More Sort Options” next to “Row Labels”. Then select “Descending (Z to A) by:” “Average of Followers”:
Next, add some conditional formatting to each row so the outliers pop out. You have to format each row separately, or Excel will use the largest number (in this case dannysullivan’s followers) as the baseline for all fields.
The keen observer has probably noticed that using follower gain/loss as a measuring stick to compare one Twitter account against another is not an apples-to-apples comparison. Followers beget more followers, and Danny Sullivan and Matt Cutts have the advantage in this group. It is more telling to normalize the data and compare the ratio of average followers gained to total followers. This can be done by selecting a single field of the pivot table under the ‘Average of follower gain/loss’ column and then selecting Options (in the ribbon) > Formulas > Calculated Field. It looks something like the below: simply double-click ‘follower gain/loss’, divide it by ‘followers’, and hit OK.
Make sure to format the new column as a percentage, or it will just round off to zero, and that’s not a very useful insight. Your end product should look something like the below and will give you some insight into which handles are the outliers of your group.
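The calculated-field step boils down to one division, sketched here in pandas with hypothetical handles and numbers: normalizing average daily gain by follower count keeps the big accounts from dominating the comparison.

```python
# Normalize average daily gain by total followers (illustrative numbers).
import pandas as pd

pivot = pd.DataFrame(
    {"followers": [200000.0, 2000.0], "follower_gain": [100.0, 20.0]},
    index=["bighandle", "smallhandle"],
)

pivot["gain_rate"] = pivot["follower_gain"] / pivot["followers"]
# Shown as percentages, the small account is actually growing faster.
print((pivot["gain_rate"] * 100).round(2).tolist())  # [0.05, 1.0]
```

This is the same reason the Excel column must be percentage-formatted: the raw ratios are tiny fractions that would display as zero.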
In this case there are a few things that stand out, including the fact that Aaron Wall has been losing followers at the fastest rate. He also has not Tweeted in nearly a month. Is his lack of engagement with his followers causing unfollows? It could be, but correlation is not causation, as our friends at SEOmoz remind us. A deeper dive into the data may help us reveal something… Ah-ha! Looking at the raw data, we see that aaronwall was dropped by 284 people in an 8-day period (note: he no longer uses this account to Tweet, using SEObook instead, but the decrease in followers over this 8-day period was dramatically higher than on the other days in this data set). With a little digging, it turns out a post by AaronWall on SEObook entitled “Educating the Market: Is outing & writing polarizing drivel hate baiting or a service to the community” caused a bit of an uproar. Could the post on SEObook have had a negative impact on the @AaronWall Twitter profile? A similar analysis could be done by a publisher on a journalist who wrote a controversial piece of content. However, to really understand what’s going on we need to explore tone and sentiment. I’ll be mashing this data with open source sentiment analysis tools in future posts. Stay tuned…
It’s not exactly like he came up with the idea, but at least he didn’t stop it.
If you view the source of barackobama.com you will see the following piece of ASCII art commented at the top of the HTML doc.
The technological difference between Obama’s and Mitt Romney’s websites is stark. Some may even say symbolic, but that’s not for this type of blog. Barry’s web team is using HTML5 and has enough time (and money) on hand to make ASCII art. If you jump over to Mitt Romney’s site and see what is going on, you’ll notice that it is coded in a more antiquated version of HTML (XHTML 1.0) – not quite cutting edge. It also uses the free, open-source content management system Drupal (great product by the way), which if nothing else is financially in line with fiscal conservative values; perhaps Newt should have taken a page out of this book.
On a separate but somewhat-related note: in the 2008 election we saw the Barack Obama campaign leverage social media very effectively to build a grass-roots campaign that raised a record amount of money. I predict that in this election we will see the Obama campaign innovate using geo-social/mobile applications. After all, he did join Foursquare a few months back.
1) Please tell me your background and what you do for a living.
2) Please complete the phrase “PPC to SEO as _____ is to ______.” (And please explain your answer.)
3) If businesses are raking in money via paid search, why should they care about SEO?
4) Many objections to SEO revolve around the indefinite, unpredictable nature of the results (which contrasts to the highly precise ROI from PPC). How would you answer that?
5) How can an SEO client determine whether a prospective SEO provider is knowledgeable and capable of achieving excellent results for them?
6) What is a typical SEO engagement for you?