Adam Green
Twitter API Consultant
adam@140dev.com
781-879-2960
@140dev

The Twitter economy will employ a diverse labor force

by Adam Green on March 20, 2014


in Future of Twitter

This is the third part of a series on the future of Twitter development: Part I, Part II.

In order for Twitter to reach everywhere, a skilled labor force of API developers is needed around the world. Developers make it possible to integrate Twitter into businesses in a more useful and personalized manner. This type of integration will give Twitter ubiquity and longevity, two attributes that are almost impossible for competitors to overcome.

The API developer community is more complex than most realize. New people enter continually as students and self-taught programmers. Others come in as corporate developers who are told to work on Twitter projects. As Twitter-related companies in the top tier continue to grow multi-million-dollar products, they bring in experienced coders from outside the Twitter world to manage big databases and build enterprise-level tools. This is a dynamic and growing group.

[Image: Twitter Economy - Developer Labor Force]

One of the great strengths of the Twitter API is that it can be run by self-taught programmers who quickly turn themselves into productive tool builders. These programmers often emerge from an actual need within a business or other organization. Software start-ups and consulting companies are born as a result of this group's work.

Students are the other influx of talent I often see in the Twitter world. Doing a research project on Twitter data now seems to be a standard task for Computer Science students. With that skill set, Twitter will be the obvious place for this cohort to reach when adding features to any system. It becomes a generation's standard for social media data.

Marketing automation companies are doing so well that HubSpot has IPO buzz. Big Data companies based on Twitter are also booming and signing multi-million dollar partnerships.

So we have an established API infrastructure, a billion dollar ad revenue stream to work beside, and a rich set of tools built by a growing and diverse tech community. It sounds like my road trip view of economics is about to play out big-time with Twitter.


Following the Twitter ad revenue stream

by Adam Green on March 19, 2014


in Future of Twitter

This is the second part of a series on the future of Twitter development: Part I, Part III.

Twitter ads are the fuel that will soon run the Twitter API development world. I spent a lot of time thinking about this on my recent road trip. Data from users interacting with Twitter ads can be expanded upon and optimized by API developers. This creates a win-win-win model: developers make money helping clients use Twitter ads more effectively, businesses get a better return on their investment, and Twitter sells more ads to happy clients.

The current Twitter for Business marketing campaign is convincing businesses to spend increasing amounts of ad money on Twitter: $600 million last year and over $1 billion projected this year. You know what's cool? Breaking that threshold and envisioning how the economics of Twitter ad buying will mature as billions of dollars start flowing.

Developers will be able to make money from this revenue stream by extending and optimizing Twitter ads for businesses. They will be the labor force that helps ad buyers leverage the leads and data they collect. For instance, once you have thousands of screen names and email addresses from Lead Generation Cards, you still need to engage with these people. Ad buyers could simply treat these leads as a normal mailing list, but that misses the whole formula of Twitter.

Leads plus engagement produces relationships. Energy strategically applied to relationships produces communities of loyal customers and supporters.

The integration of API developers with the Twitter ad buying cycle creates a more robust and innovative economy that benefits all parties.

[Image: Twitter Economy]

We will have an economy where the financial interests of Twitter, businesses and developers are aligned. There will be large third party companies that sell general solutions, and smaller development teams who customize the results from Twitter’s API and Twitter ad data to provide vertical market solutions. I’ll describe this future API developer labor force in more detail tomorrow.


From Twitter platform to economy

by Adam Green on March 18, 2014


in Future of Twitter

This is the first part of a series on the future of Twitter development: Part II, Part III.

Last week I took a road trip through Utah, across the Rocky Mountains, and into Boulder, Colorado. It gave me a chance to clear my head from Boston’s endless winter and contemplate my next steps for Twitter development. The route brought me past some of the most spectacular natural landscapes in the world, but my attention was also on the roads and towns. I could see patterns that weren’t as obvious in New England.

Back East the density of roads, businesses and residences blends into a solid mass. In the Southwest you can recognize the thin infrastructure of roads people have imposed on the terrain.

Towns along these roads with large clusters of businesses, such as Moab and Vail, show where there is a concentration of capital from tourism. The sudden, confined explosions of restaurants and retail stores made it clear that pumping tourist money into a town practically necessitates the growth of a marketplace.

I got to Boulder at the end of the road trip, my first visit in seven years. The growth of tech in that time is amazing; Pearl Street is now looking a lot like University Ave in Palo Alto. A culture that embraces entrepreneurship and an educated labor force have combined to create a solid tech economy.

Infrastructure, capital and labor. The pattern towards a robust economy plays out repeatedly before our eyes, and we’re now seeing it in the Twittersphere.

Twitter provided a new information highway, and made sure it ran everywhere. Businesses are now starting to invest and put up their billboards, in the form of $600 million spent on Twitter ads last year. This marketplace is only the beginning, a small fragment of the potential Twitter has to offer. The final, essential ingredient is a robust labor force of skilled API developers that will improve and expand upon the Twitter ad market. Once their power is unleashed, Twitter will move from a nascent marketplace to an economy of extraordinary scale.

US Southwest         Ingredients           Result        Twitter
Highways and Roads   Infrastructure        Platform      Twitter API and Servers
Tourist Towns        Revenue Stream        Marketplace   Businesses Buying Ads
Boulder, CO          Skilled Labor Force   Economy       API Developers

My next posts will explain how API developers can make money by building in this marketplace, and what the future could hold as they help produce a global economy.


Installation note: All of the Twitter API tools are built around a set of common library files that must be downloaded and installed before running the code shown here. To run the scripts shown here, just copy them into the same directory as the common files. You can then run them as a web URL, from the command line of a Telnet or SSH client, or as a cron job.

The last API tool showed you how to collect all the members of any Twitter list. Today we will look at another useful technique for lists: collecting details on the lists owned by any account. This is done with the /lists/ownerships API call.

The list data collected by this tool will be stored in a database table called list_ownerships, so it can be used as input for other tools. The MySQL creation statement for this table is at the start of the script. You can copy this statement and paste it into the SQL box in phpMyAdmin.

Before looking at the code, here is some background. Every list on Twitter has a unique list id value, which should be stored as a 64-bit unsigned number. In MySQL terms, this is an unsigned Bigint data type. The other identifier for each list is the slug, which is the list name with spaces replaced by dashes. The combination of the list owner’s screen name and the slug is used to create the URL for the list. This script will also record the full list name with spaces included.
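As a small illustration of how the slug and list URL fit together, here is a sketch with helper names of my own invention. The slug derivation is a simplification: Twitter's actual rules also strip punctuation beyond replacing spaces with dashes.

```php
<?php
// Hypothetical helpers showing how a list URL is assembled from the
// owner's screen name and the slug. name_to_slug() is a simplified
// assumption, not Twitter's exact algorithm.
function name_to_slug($name) {
  return strtolower(str_replace(' ', '-', $name));
}

function list_url($owner_screen_name, $slug) {
  return "https://twitter.com/$owner_screen_name/lists/$slug";
}

print list_url('twitter', name_to_slug('World Cup')) . "\n";
// https://twitter.com/twitter/lists/world-cup
?>
```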

Lists can be either public or private, and this value is returned by the API in an element called mode. When requesting /lists/ownerships, you are given only the account’s public lists, unless the account whose OAuth tokens are being used is the list’s owner. Remember to preserve the privacy of all users. If you do collect data on a private list, you should never display it or share it with anyone.

list_ownerships.php

<?php
// Copy lists owned by a user to a database table
// Copyright (c) 2014 Adam Green. All rights reserved. 
// Contact info: http://140dev.com, @140dev, adam@140dev.com
// Released as open source under MIT license

/* Create this table to store lists
CREATE TABLE IF NOT EXISTS `list_ownerships` (
  `list_id` bigint(20) NOT NULL,
  `owner_screen_name` varchar(20) NOT NULL,
  `slug` varchar(100) NOT NULL,
  `name` varchar(100) NOT NULL,
  `created_at` datetime NOT NULL,
  `description` varchar(100) DEFAULT NULL,
  `mode` enum('public','private') NOT NULL,
  `members` int(11) NOT NULL,
  `subscribers` bigint(20) NOT NULL,
  PRIMARY KEY (`list_id`),
  KEY `owner_screen_name` (`owner_screen_name`),
  KEY `slug` (`slug`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;
*/

// $owner_screen_name is a string with list owner's screen name
// $table_name is a string with name of DB table for users
// $clear_table is 1 to empty table first, 0 to leave intact
function list_ownerships($owner_screen_name, $table_name, $clear_table=0) {
  
  if (empty($owner_screen_name) || empty($table_name)) {
    print "ERROR: Invalid arguments";
    exit;
  }
    
  // Connect to the database
  require('db_lib.php');
  $oDB = new db;
  
  if ($clear_table) {
    // Clear the table of old entries for this owner
    $oDB->select("DELETE FROM $table_name
        WHERE owner_screen_name = '" . $oDB->escape($owner_screen_name) . "'");
  }

  // Connect to API with OAuth
  require('oauth_lib.php');
  $connection = get_connection(); 
  
  // Loop through pages of lists; each request returns up to 1,000 lists
  // This is rate limited to 15 calls per 15 minute window
  // Start cursor at -1, end when cursor becomes 0
  $cursor = -1; 
  while ($cursor != 0) { 
    $connection->request('GET', $connection->url('1.1/lists/ownerships'), 
      array('screen_name' => $owner_screen_name,
        'count' => 100,  // Asking for too many lists can cause a timeout
        'cursor' => $cursor));
        
    // Exit on API error
    if ($connection->response['code'] != 200) {
      print "ERROR: " . $connection->response['code'] . "\n";
      print $connection->response['response'];
      exit;
    }
  
    $results = json_decode($connection->response['response']);
    $lists = $results->lists;
    foreach($lists as $list) {
      
      $list_id = $list->id_str;
      
      // Prevent duplicates
      if (!$oDB->in_table($table_name,"list_id=$list_id")) {
        
         // Escape string values that may contain quotes
         $field_values = "list_id = $list_id, " .
          "owner_screen_name = '" . $oDB->escape($owner_screen_name) . "', " .
          "slug = '" . $oDB->escape($list->slug) . "', " .
          "name = '" . $oDB->escape($list->name) . "', " .
          "description = '" . $oDB->escape($list->description) . "', " .
          "created_at = '" . $oDB->date($list->created_at) . "', " .
          "members = " . $list->member_count . ", " .
          "subscribers = " . $list->subscriber_count . ", " .
          "mode = '" . $list->mode . "'";
          
        $oDB->insert($table_name, $field_values);
      }
    }

    // Get the cursor for the next page of results
    // Use the string form to avoid 64-bit overflow on 32-bit PHP builds
    $cursor = $results->next_cursor_str;
  }
}

?>

You can test this tool with list_ownerships_test.php, which is set to collect all the lists owned by the @twitter account.

list_ownerships_test.php

<?php

require('list_ownerships.php');
list_ownerships('twitter','list_ownerships',1);

?>


Streaming API: Multi-level tweet collection databases

by Adam Green on February 14, 2014


in Streaming API

Yesterday’s streaming API post described a multiple server model for handling high rate tweet collection. Today I’d like to cover a different architecture that addresses this problem with a single server running multiple databases.

Let’s say you want to display tweets for the most active stocks each day. The streaming API lets you collect tweets for 400 keywords, or in this case, the 400 most active stock symbols. That will be a high flow rate, and a large database to query if your site only needs to display tweets for 20 or 30 stocks at any one time.

A solution is to store all the tweets, users and related data you receive for all 400 stocks in one database; we'll call it tweet_collect. You can then create a separate database, called tweet_serve, and have your code copy just the tweets for active stocks to this database as they arrive. Your website only needs to read from tweet_serve, which will be much smaller and will therefore deliver query results faster.

When a new stock becomes active, you will already have its tweets available in tweet_collect, so you can quickly copy its tweets to tweet_serve and be ready to display on the site. When the stock is no longer active, you can delete its data from tweet_serve.
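A minimal sketch of that copy step, assuming both databases live on the same MySQL server with identical tweets tables and a keyword column (the table and column names here are assumptions for illustration, not part of the framework):

```sql
-- Copy a newly active stock's tweets from the collection database
-- to the serving database. INSERT IGNORE skips rows already copied.
INSERT IGNORE INTO tweet_serve.tweets
  SELECT * FROM tweet_collect.tweets
  WHERE keyword = 'AAPL';

-- When the stock goes quiet, drop its rows from the serving database.
-- The full history remains in tweet_collect.
DELETE FROM tweet_serve.tweets
  WHERE keyword = 'AAPL';
```

Because tweet_collect keeps everything, the copy is cheap to redo whenever a stock becomes active again.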

The limitation of this technique is that you are restricted to topics that can be covered adequately within the 400-keyword cap. As long as this fits your application's needs, this model will produce a much faster website display.

When a keyword becomes active that isn't in your normal collection list, you can fill in its data with the search API as needed. Search isn't as powerful as streaming for large amounts of data, but if you need ad hoc collection of tweets for a few extra keywords, it does a good job. You can query it up to 720 times an hour and request tweets for about 10 to 15 keywords each time. These tweets would also go into the tweet_serve database.
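Such a fill-in script could follow the same library conventions as the tools on this site. This is only a sketch: the stock symbols are placeholders, and the storage step is left as a comment since the tweet_serve schema isn't shown here.

```php
<?php
// Ad hoc tweet collection with the search API for keywords that are
// outside the streaming track list. Uses the same common library
// files (oauth_lib.php) as the other tools on this site.
require('oauth_lib.php');
$connection = get_connection();

// OR together a few extra symbols in a single query
$connection->request('GET', $connection->url('1.1/search/tweets'),
  array('q' => '$GPRO OR $PLUG', 'count' => 100));

if ($connection->response['code'] == 200) {
  $results = json_decode($connection->response['response']);
  foreach ($results->statuses as $tweet) {
    // Insert each tweet into the tweet_serve database here
    print $tweet->id_str . ': ' . $tweet->text . "\n";
  }
}
?>
```

This is rate limited separately from streaming, so it can run alongside your collection server without competing for the same quota.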


Twitter API Tools: Get list member user profiles

February 14, 2014

Twitter lists are a really underutilized feature, especially now that the limits have been raised to 1,000 lists, each with 5,000 members. I doubt if anyone […]


Streaming API: Multiple server collection architecture

February 12, 2014

Now that I’ve upgraded the streaming API framework to make it easier to manage keyword tweet collection, the next step is handling the increased data flow that results from more keywords. One simple solution is to upgrade your server. MySQL loves as much RAM as it can be given, and switching to a solid state […]


Twitter API Tools: Handling protected accounts

February 12, 2014

The previous post touched on some issues of protected accounts that should be pursued in more detail. Twitter’s rules for developers have two basic principles that apply here: Don’t surprise the user, and Respect user privacy. Both certainly apply to revealing data from a protected account. What becomes clear if you experiment with the code […]


Twitter API Tools: Get user’s last tweet

February 12, 2014

For my first tool I'm going to keep it simple. This one should demonstrate my goals for these tools. They should be useful, single purpose, simple […]


Twitter API Tools: Installing the common files

February 11, 2014

The previous post listed the set of common library files needed to use the Twitter API tools. Installing this package to get ready to use the tools is pretty easy: Download the zipped set of files and unzip them on your local machine. Create a web accessible directory on your server. You’ll want this to […]


Twitter API Tools: Common library files

February 11, 2014

The collection of API tools will only need a few library files: db_lib.php, oauth_lib.php, and Matt Harris’ tmhOAuth library. These are all packaged together in a convenient zip file, ready for download. Here’s a summary of what you’ll find inside, and then we’ll look at the library code: cacert.pem – SSL certificate used by tmhOAuth […]
