<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>140dev</title>
	<atom:link href="http://140dev.com/feed/" rel="self" type="application/rss+xml" />
	<link>http://140dev.com</link>
	<description>Twitter API Programming Tips, Tutorials, Source Code Libraries and Consulting</description>
	<lastBuildDate>Wed, 31 Jul 2019 10:03:15 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.6</generator>
		<item>
		<title>140dev streaming API framework upgrade is required for the Phirehose library</title>
		<link>http://140dev.com/twitter-api-programming-blog/140dev-streaming-api-framework-upgrade-is-required-for-the-phirehose-library/</link>
		<comments>http://140dev.com/twitter-api-programming-blog/140dev-streaming-api-framework-upgrade-is-required-for-the-phirehose-library/#comments</comments>
		<pubDate>Wed, 23 Apr 2014 10:44:52 +0000</pubDate>
		<dc:creator>Adam Green</dc:creator>
				<category><![CDATA[140dev Source Code]]></category>
		<category><![CDATA[Announcements]]></category>
		<category><![CDATA[Phirehose]]></category>

		<guid isPermaLink="false">http://140dev.com/?p=3032</guid>
		<description><![CDATA[The streaming API changed its behavior on April 22, breaking all the copies of the Phirehose library I have in use with my 140dev streaming API framework. Thankfully, a patched version of Phirehose was posted within a few hours. If you are running any copies of the 140dev framework, you need to replace your current [&#8230;]]]></description>
				<content:encoded><![CDATA[<p></p><p>The streaming API changed its behavior on April 22, breaking all the copies of the Phirehose library I have in use with my 140dev streaming API framework. Thankfully, a patched version of Phirehose was posted within a few hours. If you are running any copies of the 140dev framework, you need to replace your current copy of Phirehose.php with the <a href="https://github.com/fennb/phirehose/blob/master/lib/Phirehose.php">latest patched version</a>. If you are running the framework code, you will find your current copy of Phirehose.php in the /libraries/phirehose folder.</p>
]]></content:encoded>
			<wfw:commentRss>http://140dev.com/twitter-api-programming-blog/140dev-streaming-api-framework-upgrade-is-required-for-the-phirehose-library/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>The Twitter economy will employ a diverse labor force</title>
		<link>http://140dev.com/twitter-api-programming-blog/the-twitter-economy-will-employ-a-diverse-labor-force/</link>
		<comments>http://140dev.com/twitter-api-programming-blog/the-twitter-economy-will-employ-a-diverse-labor-force/#comments</comments>
		<pubDate>Thu, 20 Mar 2014 20:26:00 +0000</pubDate>
		<dc:creator>Adam Green</dc:creator>
				<category><![CDATA[Future of Twitter]]></category>

		<guid isPermaLink="false">http://140dev.com/?p=3015</guid>
		<description><![CDATA[This is the third part of a series on the future of Twitter development: Part I, Part II. In order for Twitter to reach everywhere, a skilled labor force of API developers is needed around the world. Developers make it possible to integrate Twitter into businesses in a more useful and personalized manner. This type [&#8230;]]]></description>
				<content:encoded><![CDATA[<p></p><p><em>This is the third part of a series on the future of Twitter development: <a href="/twitter-api-programming-blog/from-twitter-platform-to-economy/">Part I</a>, <a href="/twitter-api-programming-blog/following-the-twitter-ad-revenue-stream/">Part II</a>.</em></p>
<p>In order for Twitter to reach everywhere, a skilled labor force of API developers is needed around the world. Developers make it possible to integrate Twitter into businesses in a more useful and personalized manner. This type of integration will give Twitter ubiquity and longevity, two attributes that are almost impossible for competitors to overcome.</p>
<p>The API developer community is more complex than most realize. New people enter continually as students and self-taught programmers. Others come in as corporate developers who are told to work on Twitter projects. As Twitter-related companies in the top tier continue to grow multi-million-dollar products, they bring in experienced coders from outside the Twitter world to manage big databases and build enterprise-level tools. This is a dynamic and growing group.</p>
<p style="width:340px;margin:0 auto;text-align:right;margin-bottom:18px;font-style:italic;"><img style="width:350px;" src="http://140dev.com/blog_images/labor_force.png" alt="Twitter Economy" /><br/>Developer Labor Force</p>
<p>One of the great strengths of the Twitter API is that it can be picked up by self-taught programmers who quickly turn themselves into productive tool builders. Self-taught programmers often emerge from a real need within a business or other organization. Software start-ups and consulting companies are born as a result of this group’s work.</p>
<p>Students are the other influx of talent I often see in the Twitter world. Doing a research project on Twitter data now seems to be a standard task for Computer Science students. That skill set means Twitter will be the obvious place for this cohort to reach when adding social features to any system; it becomes a generation’s standard source of social media data. </p>
<p>Marketing automation companies are doing so well that HubSpot has IPO buzz. Big Data companies based on Twitter are also booming and signing multi-million dollar partnerships. </p>
<p>So we have an established API infrastructure, a billion dollar ad revenue stream to work beside, and a rich set of tools built by a growing and diverse tech community. It sounds like <a href="http://140dev.com/twitter-api-programming-blog/from-twitter-platform-to-economy/">my road trip view of economics</a> is about to play out big-time with Twitter.</p>
]]></content:encoded>
			<wfw:commentRss>http://140dev.com/twitter-api-programming-blog/the-twitter-economy-will-employ-a-diverse-labor-force/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Following the Twitter ad revenue stream</title>
		<link>http://140dev.com/twitter-api-programming-blog/following-the-twitter-ad-revenue-stream/</link>
		<comments>http://140dev.com/twitter-api-programming-blog/following-the-twitter-ad-revenue-stream/#comments</comments>
		<pubDate>Wed, 19 Mar 2014 17:11:24 +0000</pubDate>
		<dc:creator>Adam Green</dc:creator>
				<category><![CDATA[Future of Twitter]]></category>

		<guid isPermaLink="false">http://140dev.com/?p=2996</guid>
		<description><![CDATA[This is the second part of a series on the future of Twitter development: Part I, Part III. Twitter ads are the fuel that will soon run the Twitter API development world. I spent a lot of time thinking about this on my recent road trip. Data from users interacting with Twitter ads can be [&#8230;]]]></description>
				<content:encoded><![CDATA[<p></p><p><em>This is the second part of a series on the future of Twitter development: <a href="/twitter-api-programming-blog/from-twitter-platform-to-economy/">Part I</a>, <a href="/twitter-api-programming-blog/the-twitter-economy-will-employ-a-diverse-labor-force/">Part III</a>.</em></p>
<p>Twitter ads are the fuel that will soon run the Twitter API development world. <a href="http://140dev.com/twitter-api-programming-blog/from-twitter-platform-to-economy/">I spent a lot of time thinking about this on my recent road trip.</a> Data from users interacting with Twitter ads can be expanded upon and optimized by API developers. This creates a win-win-win model: developers make money helping clients use Twitter ads more effectively, businesses get a better return on their investment and Twitter sells more ads to happy clients.</p>
<p>The current Twitter for Business marketing campaign is convincing businesses to spend increasing amounts of ad money on Twitter; $600 million last year and over one billion projected this year. You know what’s cool? Breaking that threshold and envisioning how the economics of Twitter ad buying will mature as billions of dollars start flowing.</p>
<p>Developers will be able to make money from this revenue stream by extending and optimizing Twitter ads for businesses. They will be the labor force that helps ad buyers leverage the leads and data they collect. For instance, once you have thousands of screen names and email addresses from Lead Generation Cards, you still need to engage with these people. Ad buyers could simply treat these leads as a normal mailing list, but that misses the whole formula of Twitter.</p>
<p><strong>Leads plus engagement produces relationships. Energy strategically applied to relationships produces communities of loyal customers and supporters.</strong></p>
<p>The integration of API developers with the Twitter ad buying cycle creates a more robust and innovative economy that benefits all parties.</p>
<p style="width:340px;margin:0 auto;text-align:right;margin-bottom:18px;font-style:italic;"><img style="width:350px;" src="http://140dev.com/blog_images/triangle_economy.png" alt="Twitter Economy" /><br/>Twitter Economy</p>
<p>We will have an economy where the financial interests of Twitter, businesses and developers are aligned. There will be large third party companies that sell general solutions, and smaller development teams who customize the results from Twitter’s API and Twitter ad data to provide vertical market solutions. I’ll describe this future API developer labor force in more detail tomorrow.</p>
]]></content:encoded>
			<wfw:commentRss>http://140dev.com/twitter-api-programming-blog/following-the-twitter-ad-revenue-stream/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>From Twitter platform to economy</title>
		<link>http://140dev.com/twitter-api-programming-blog/from-twitter-platform-to-economy/</link>
		<comments>http://140dev.com/twitter-api-programming-blog/from-twitter-platform-to-economy/#comments</comments>
		<pubDate>Tue, 18 Mar 2014 18:20:55 +0000</pubDate>
		<dc:creator>Adam Green</dc:creator>
				<category><![CDATA[Future of Twitter]]></category>

		<guid isPermaLink="false">http://140dev.com/?p=2971</guid>
		<description><![CDATA[This is the first part of a series on the future of Twitter development: Part II, Part III. Last week I took a road trip through Utah, across the Rocky Mountains, and into Boulder, Colorado. It gave me a chance to clear my head from Boston’s endless winter and contemplate my next steps for Twitter [&#8230;]]]></description>
				<content:encoded><![CDATA[<p></p><p><em>This is the first part of a series on the future of Twitter development: <a href="http://140dev.com/twitter-api-programming-blog/following-the-twitter-ad-revenue-stream/">Part II</a>, <a href="/twitter-api-programming-blog/the-twitter-economy-will-employ-a-diverse-labor-force/">Part III</a>.</em></p>
<p>Last week I took a road trip through Utah, across the Rocky Mountains, and into Boulder, Colorado. It gave me a chance to clear my head from Boston’s endless winter and contemplate my next steps for Twitter development. The route brought me past some of the most spectacular natural landscapes in the world, but my attention was also on the roads and towns. I could see patterns that weren’t as obvious in New England. </p>
<p>Back East the density of roads, businesses and residences blends into a solid mass. In the Southwest you can recognize the thin infrastructure of roads people have imposed on the terrain.</p>
<p>Towns along these roads with large clusters of businesses, such as Moab and Vail, show where there is a concentration of capital from tourism. The sudden, confined explosions of restaurants and retail stores made it clear that pumping tourist money into a town practically necessitates the growth of a marketplace.</p>
<p>I got to Boulder at the end of the road trip, my first visit in seven years. The growth of tech in that time is amazing; Pearl Street is now looking a lot like University Ave in Palo Alto. A culture that embraces entrepreneurship and an educated labor force have combined to create a solid tech economy. </p>
<p><strong>Infrastructure, capital and labor.</strong> The pattern towards a robust economy plays out repeatedly before our eyes, and we’re now seeing it in the Twittersphere.</p>
<p>Twitter provided a new information highway, and made sure it ran everywhere. Businesses are now starting to invest and put up their billboards, in the form of $600 million spent on Twitter ads last year. This marketplace is only the beginning, a small fragment of the potential Twitter has to offer. The final, essential ingredient is a robust labor force of skilled API developers that will improve and expand upon the Twitter ad market. Once their power is unleashed, Twitter will move from a nascent marketplace to an economy of extraordinary scale.</p>
<style type="text/css">
.tg  {border-collapse:collapse;border-spacing:0;border-color:#999;margin-bottom:20px;text-align:center;}
.tg td{font-family:Arial, sans-serif;font-size:13px;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#999;color:#444;background-color:#F7FDFA;}
.tg th{font-family:Arial, sans-serif;font-size:13px;font-weight:normal;padding:10px 5px;border-style:solid;border-width:1px;overflow:hidden;word-break:normal;border-color:#999;color:#fff;background-color:#26ADE4;}
.tg .tg-e3zv{font-weight:bold;min-width:144px;}
</style>
<table class="tg">
<tr>
<th class="tg-e3zv">US Southwest</th>
<th class="tg-e3zv">Ingredients</th>
<th class="tg-e3zv">Result</th>
<th class="tg-e3zv">Twitter</th>
</tr>
<tr>
<td class="tg-031e">Highways and Roads</td>
<td class="tg-031e">Infrastructure</td>
<td class="tg-031e">Platform</td>
<td class="tg-031e">Twitter API and Servers</td>
</tr>
<tr>
<td class="tg-031e">Tourist Towns</td>
<td class="tg-031e">Revenue Stream</td>
<td class="tg-031e">Marketplace</td>
<td class="tg-031e">Businesses Buying Ads</td>
</tr>
<tr>
<td class="tg-031e">Boulder, CO</td>
<td class="tg-031e">Skilled Labor Force</td>
<td class="tg-031e">Economy</td>
<td class="tg-031e">API Developers</td>
</tr>
</table>
<p>My <a href="http://140dev.com/twitter-api-programming-blog/following-the-twitter-ad-revenue-stream/">next posts</a> will explain how API developers can make money by building in this marketplace, and what the future could hold as they help produce a global economy. </p>
]]></content:encoded>
			<wfw:commentRss>http://140dev.com/twitter-api-programming-blog/from-twitter-platform-to-economy/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>The inevitable path to vertical Twitter apps</title>
		<link>http://140dev.com/twitter-api-programming-blog/the-inevitable-path-to-vertical-twitter-apps/</link>
		<comments>http://140dev.com/twitter-api-programming-blog/the-inevitable-path-to-vertical-twitter-apps/#comments</comments>
		<pubDate>Wed, 26 Feb 2014 12:29:07 +0000</pubDate>
		<dc:creator>Adam Green</dc:creator>
				<category><![CDATA[The future of Twitter]]></category>
		<category><![CDATA[Vertical Twitter Apps]]></category>

		<guid isPermaLink="false">http://140dev.com/?p=2949</guid>
		<description><![CDATA[We are entering the next phase of Twitter&#8217;s maturity. With the IPO completed, it now has enough market validation and cash to be accepted as a valid advertising vehicle by major brands and the rapidly evolving Social TV industry. These are great targets for advertising, but there is another market that I see emerging over [&#8230;]]]></description>
				<content:encoded><![CDATA[<p></p><p>We are entering the next phase of Twitter&#8217;s maturity. With the IPO completed, it now has enough market validation and cash to be accepted as a valid advertising vehicle by major brands and the rapidly evolving Social TV industry. These are great targets for advertising, but there is another market that I see emerging over the next couple of years. I call this Vertical Twitter. </p>
<p>There are some features that all Twitter business users and advertisers need: targeted leads, high click-thru rates on URLs in tweets, and high numbers of quality followers. There are now good third-party tools for handling these generalized marketing needs, and Twitter&#8217;s own tools, such as the ads API and Cards, also do a great job, but I see Vertical Twitter as extending beyond marketing. The integration of API based technology into a company&#8217;s core business and sales functions will be the target of this next step in the Twitter story. </p>
<p>The integration of new computer technology has repeatedly followed the same path: from techie enthusiasts, to business pioneers proselytizing in their workplaces, to eventual acceptance by businesses as a core part of their IT infrastructure. This evolution always starts with marketing and progresses to custom solutions for specific industries. We have seen it with productivity tools from 1980 to 1994, websites and blogs from 1995 to 2005, and now it is playing out in social networks from 2006 to the present. In each era products evolve from solving the simplest, most generic tasks into vertical applications aimed at the specific needs of realtors, financial analysts, doctors, lawyers, farmers, politicians, etc. </p>
<p>Twitter is uniquely positioned to lead social networking into vertical market applications. The combination of the Twitter API and hundreds of thousands of API developers gives Twitter the means and labor force to make this possible. You have probably been getting the same Twitter for Business emails I have from Twitter. There is no doubt that Twitter&#8217;s management also recognizes the value of small and medium businesses in selling advertising. Once these businesses understand Twitter better as a marketing vehicle, the demand for vertical apps that address other aspects of their business is inevitable. </p>
]]></content:encoded>
			<wfw:commentRss>http://140dev.com/twitter-api-programming-blog/the-inevitable-path-to-vertical-twitter-apps/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Twitter API Tools: Get lists owned by a Twitter account</title>
		<link>http://140dev.com/twitter-api-programming-blog/twitter-api-tools-get-lists-owned-by-a-twitter-account/</link>
		<comments>http://140dev.com/twitter-api-programming-blog/twitter-api-tools-get-lists-owned-by-a-twitter-account/#comments</comments>
		<pubDate>Mon, 17 Feb 2014 16:14:09 +0000</pubDate>
		<dc:creator>Adam Green</dc:creator>
				<category><![CDATA[140dev Source Code]]></category>
		<category><![CDATA[Twitter API Tools]]></category>

		<guid isPermaLink="false">http://140dev.com/?p=2935</guid>
		<description><![CDATA[Installation note: All of the Twitter API tools are built around a set of common library files that must be downloaded and installed before running the code shown here. To run the scripts shown here, just copy them into the same directory as the common files. You can then run it as a web URL, [&#8230;]]]></description>
				<content:encoded><![CDATA[<p></p><p><strong>Installation note:</strong> All of the Twitter API tools are built around a set of common library files that must be <a href="http://140dev.com/twitter-api-programming-blog/twitter-api-tools-installing-the-common-files/">downloaded and installed</a> before running the code shown here. To run the scripts shown here, just copy them into the same directory as the common files. You can then run them as a web URL, from the command line of a Telnet or SSH client, or as a cron job. </p>
<p>The <a href="http://140dev.com/twitter-api-programming-blog/twitter-api-tools-get-list-member-user-profiles/">last API tool</a> showed you how to collect all the members of any Twitter list. Today we will look at another useful technique for lists: collecting details on the lists owned by any account. This is done with the <a href="https://dev.twitter.com/docs/api/1.1/get/lists/ownerships">/lists/ownerships</a> API call. </p>
<p>The list data collected by this tool will be stored in a database table called <strong>list_ownerships</strong>, so it can be used as input for other tools. The MySQL creation statement for this table is at the start of the script. You can copy this statement and paste it into the SQL box in phpMyAdmin. </p>
<p>Before looking at the code, here is some background. Every list on Twitter has a unique list id value, which should be stored as a 64-bit unsigned number. In MySQL terms, this is an unsigned Bigint data type. The other identifier for each list is the slug, which is the list name with spaces replaced by dashes. The combination of the list owner&#8217;s screen name and the slug is used to create the URL for the list. This script will also record the full list name with spaces included. </p>
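<p>As a quick illustration of how these identifiers fit together, here is a tiny sketch. The <strong>list_url()</strong> and <strong>name_to_slug()</strong> helpers exist only for this example (Twitter&#8217;s real slug rules handle more than spaces), so treat them as an approximation rather than part of the tool:</p>
<pre>&lt;?php
// Illustration only: how a list's owner, slug and URL relate

// Rough approximation of how Twitter derives a slug from a list name
function name_to_slug($name) {
  return strtolower(str_replace(' ', '-', $name));
}

// The list URL combines the owner's screen name and the slug
function list_url($owner_screen_name, $slug) {
  return "https://twitter.com/$owner_screen_name/lists/$slug";
}

print list_url('twitter', name_to_slug('Official Twitter Accts'));
// https://twitter.com/twitter/lists/official-twitter-accts

?&gt;</pre>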
<p>Lists can be either public or private, and this value is returned by the API in an element called <strong>mode</strong>. When requesting /lists/ownerships, you are given only the account&#8217;s public lists, unless the account whose OAuth tokens are being used is the list&#8217;s owner. Remember to preserve the privacy of all users. If you do collect data on a private list, you should never display it or share it with anyone. </p>
<p><strong>list_ownerships.php</strong></p>
<pre>&lt;?php
// Copy lists owned by a user to a database table
// Copyright (c) 2014 Adam Green. All rights reserved. 
// Contact info: http://140dev.com, @140dev, adam@140dev.com
// Released as open source under MIT license

/* Create this table to store lists
CREATE TABLE IF NOT EXISTS `list_ownerships` (
  `list_id` bigint(20) NOT NULL,
  `owner_screen_name` varchar(20) NOT NULL,
  `slug` varchar(100) NOT NULL,
  `name` varchar(100) NOT NULL,
  `created_at` datetime NOT NULL,
  `description` varchar(100) DEFAULT NULL,
  `mode` enum('public','private') NOT NULL,
  `members` int(11) NOT NULL,
  `subscribers` bigint(20) NOT NULL,
  PRIMARY KEY (`list_id`),
  KEY `owner_screen_name` (`owner_screen_name`),
  KEY `slug` (`slug`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;
*/

// $owner_screen_name is a string with list owner's screen name
// $table_name is a string with name of DB table for users
// $clear_table is 1 to delete this owner's old rows first, 0 to leave them intact
function list_ownerships($owner_screen_name, $table_name, $clear_table=0) {
  
  if (empty($owner_screen_name) || empty($table_name)) {
    print "ERROR: Invalid arguments";
    exit;
  }
    
  // Connect to the database
  require('db_lib.php');
  $oDB = new db;
  
  if ($clear_table) {
    // Clear the table of old entries for this owner
    $oDB->select("DELETE FROM $table_name
        WHERE owner_screen_name = '" . $oDB->escape($owner_screen_name) . "'");
  }

  // Connect to API with OAuth
  require('oauth_lib.php');
  $connection = get_connection(); 
  
  // Loop through pages of lists, each page has 20-1000 lists
  // This is rate limited to 15 calls per 15 minute window
  // Start cursor at -1, end when cursor becomes 0
  $cursor = -1; 
  while ($cursor<>0) { 
    $connection->request('GET', $connection->url('1.1/lists/ownerships'), 
      array('screen_name' => $owner_screen_name,
        'count' => 100,  // Asking for too many lists can cause a timeout
        'cursor' => $cursor));
        
    // Exit on API error
    if ($connection->response['code'] <> 200) {
      print "ERROR: " . $connection->response['code'] . "\n";
      print $connection->response['response'];
      exit;
    }
  
    $results = json_decode($connection->response['response']);
    $lists = $results->lists;
    foreach($lists as $list) {
      
      $list_id = $list->id_str;
      
      // Prevent duplicates
      if (!$oDB->in_table($table_name,"list_id=$list_id")) {
        
         // Escape string values that may contain quotes
         $field_values = "list_id = $list_id, " .
          "owner_screen_name = '" . $oDB->escape($owner_screen_name) . "', " .
          "slug = '" . $oDB->escape($list->slug) . "', " .
          "name = '" . $oDB->escape($list->name) . "', " .
          "description = '" . $oDB->escape($list->description) . "', " .
          "created_at = '" . $oDB->date($list->created_at) . "', " .
          "members = " . $list->member_count . ", " .
          "subscribers = " . $list->subscriber_count . ", " .
          "mode = '" . $list->mode . "'";
          
        $oDB->insert($table_name, $field_values);
      }
    }

    // Get the cursor for the next page of results
    $cursor = $results->next_cursor;
  }
}

?&gt;</pre>
<p>You can test this tool with list_ownerships_test.php, which is set to collect all the lists owned by the <a href="http://twitter.com/twitter">@twitter</a> account.</p>
<p><strong>list_ownerships_test.php</strong></p>
<pre>&lt;?php

require('list_ownerships.php');
list_ownerships('twitter','list_ownerships',1);

?&gt;</pre>
]]></content:encoded>
			<wfw:commentRss>http://140dev.com/twitter-api-programming-blog/twitter-api-tools-get-lists-owned-by-a-twitter-account/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Streaming API: Multi-level tweet collection databases</title>
		<link>http://140dev.com/twitter-api-programming-blog/streaming-api-multi-level-tweet-collection-databases/</link>
		<comments>http://140dev.com/twitter-api-programming-blog/streaming-api-multi-level-tweet-collection-databases/#comments</comments>
		<pubDate>Fri, 14 Feb 2014 01:13:40 +0000</pubDate>
		<dc:creator>Adam Green</dc:creator>
				<category><![CDATA[Streaming API]]></category>

		<guid isPermaLink="false">http://140dev.com/?p=2927</guid>
		<description><![CDATA[Yesterday&#8217;s streaming API post described a multiple server model for handling high rate tweet collection. Today I&#8217;d like to cover a different architecture that addresses this problem with a single server running multiple databases. Let&#8217;s say you want to display tweets for the most active stocks each day. The streaming API lets you collect tweets [&#8230;]]]></description>
				<content:encoded><![CDATA[<p></p><p>Yesterday&#8217;s <a href="http://140dev.com/twitter-api-programming-blog/streaming-api-multiple-server-collection-architecture/">streaming API post</a> described a multiple server model for handling high rate tweet collection. Today I&#8217;d like to cover a different architecture that addresses this problem with a single server running multiple databases. </p>
<p>Let&#8217;s say you want to display tweets for the most active stocks each day. The streaming API lets you collect tweets for 400 keywords, or in this case, the 400 most active stock symbols. That will be a high flow rate, and a large database to query if your site only needs to display tweets for 20 or 30 stocks at any one time. </p>
<p>A solution is to store all the tweets, users and related data you receive for all 400 stocks in one database; we&#8217;ll call it tweet_collect. You can then create a separate database, call it tweet_serve, and have your code copy just the tweets for active stocks into it as they arrive. Your website only needs to read from tweet_serve, which will be much smaller and therefore deliver query results faster. </p>
<p>When a new stock becomes active, you will already have its tweets available in tweet_collect, so you can quickly copy its tweets to tweet_serve and be ready to display on the site. When the stock is no longer active, you can delete its data from tweet_serve. </p>
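<p>Here is a rough sketch of those copy and delete steps on a single MySQL server that holds both databases. It is only an outline: it assumes each database contains the framework&#8217;s tweets table, that an active stock&#8217;s tweets can be matched by searching tweet_text for its symbol, and that the table has a primary key so INSERT IGNORE can skip duplicates. Adjust the table and column names to your own schema.</p>
<pre>&lt;?php
// Illustration only: promote one stock's tweets from tweet_collect to
// tweet_serve when it becomes active, and remove them when it goes quiet
require('db_lib.php');
$oDB = new db;

$symbol = '$AAPL'; // hypothetical newly active stock

// Copy the tweets already collected for this symbol into the serving database
$oDB->select("INSERT IGNORE INTO tweet_serve.tweets
  SELECT * FROM tweet_collect.tweets
  WHERE tweet_text LIKE '%" . $oDB->escape($symbol) . "%'");

// Later, when the stock is no longer active, clear it out of tweet_serve
$oDB->select("DELETE FROM tweet_serve.tweets
  WHERE tweet_text LIKE '%" . $oDB->escape($symbol) . "%'");

?&gt;</pre>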
<p>The limitation of this technique is that you can only cover topics adequately within the limit of 400 keywords. As long as this fits your application needs, this model will produce a much faster website display. </p>
<p>When a keyword becomes active that isn&#8217;t in your normal collection list, you can fill in the data for this with the search API as needed. Search isn&#8217;t as powerful as streaming for large amounts of data, but if you need ad hoc collection of tweets for a few extra keywords, it does a good job. You can query it up to 720 times an hour and request tweets for about 10 to 15 keywords each time. These tweets would also go into the tweet_serve database. </p>
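<p>As a minimal sketch of that kind of ad hoc search call, using the same oauth_lib.php and get_connection() pattern as the API tools on this blog: the extra symbols below are hypothetical, and instead of printing each tweet you would insert it into the tweet_serve database.</p>
<pre>&lt;?php
// Illustration only: ad hoc collection of a few extra keywords with the
// search API, to fill in data for symbols outside the streaming keyword list
require('oauth_lib.php');
$connection = get_connection();

$keywords = array('TSLA', 'NFLX', 'GPRO'); // hypothetical extra symbols

$connection->request('GET', $connection->url('1.1/search/tweets'),
  array('q' => implode(' OR ', $keywords),
    'count' => 100,
    'result_type' => 'recent'));

if ($connection->response['code'] == 200) {
  $results = json_decode($connection->response['response']);
  foreach ($results->statuses as $tweet) {
    // Insert the tweet into the tweet_serve database here
    print $tweet->text . "\n";
  }
}

?&gt;</pre>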
]]></content:encoded>
			<wfw:commentRss>http://140dev.com/twitter-api-programming-blog/streaming-api-multi-level-tweet-collection-databases/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Twitter API Tools: Get list member user profiles</title>
		<link>http://140dev.com/twitter-api-programming-blog/twitter-api-tools-get-list-member-user-profiles/</link>
		<comments>http://140dev.com/twitter-api-programming-blog/twitter-api-tools-get-list-member-user-profiles/#comments</comments>
		<pubDate>Fri, 14 Feb 2014 00:10:49 +0000</pubDate>
		<dc:creator>Adam Green</dc:creator>
				<category><![CDATA[140dev Source Code]]></category>
		<category><![CDATA[Twitter API Tools]]></category>

		<guid isPermaLink="false">http://140dev.com/?p=2924</guid>
		<description><![CDATA[Installation note: All of the Twitter API tools are built around a set of common library files that must be downloaded and installed before running the code shown here. Twitter lists are a really underutilized feature, especially now that the limits have been raised to 1,000 lists, each with 5,000 members. I doubt if anyone [&#8230;]]]></description>
				<content:encoded><![CDATA[<p></p><p><strong>Installation note:</strong> All of the Twitter API tools are built around a set of common library files that must be <a href="http://140dev.com/twitter-api-programming-blog/twitter-api-tools-installing-the-common-files/">downloaded and installed</a> before running the code shown here.</p>
<p>Twitter lists are a really underutilized feature, especially now that the limits have been raised to 1,000 lists, each with 5,000 members. I doubt if anyone will need 5,000,000 list members, but there are more uses for lists than most people realize. I think lists make a great way for users to generate a set of accounts that can be used by other apps. </p>
<p>Instead of building an entire user interface by hand to allow users to input a set of Twitter accounts for your apps, you can let them build and manage a list in Twitter, and then read that list as input for your processing. Another benefit is that users can share their lists with others. A team can collaborate through lists by creating a Twitter account and sharing the login, so they can all edit the lists. </p>
<p>Today&#8217;s tool reads any public Twitter list from any account, and adds complete account profiles for the members to a database. It allows you to collect multiple lists into a single table, while keeping their original list identity. It also clears the members of any list from the table as an option, so you can refresh the database copy when users change the list. </p>
<p>Since this tool needs a database table for storage, I have included the MySQL creation statement at the start of the script. I&#8217;m going to keep this model for all the tools. It also serves as useful documentation. You can copy this statement and paste it into the SQL box in phpMyAdmin. </p>
<p><strong>list_members.php</strong></p>
<pre>&lt;?php
// Copy list members to a database table
// Copyright (c) 2014 Adam Green. All rights reserved. 
// Contact info: http://140dev.com, @140dev, adam@140dev.com
// Released as open source under MIT license

/* Create this table to store list members
CREATE TABLE IF NOT EXISTS `list_members` (
  `owner_screen_name` varchar(20) NOT NULL,
  `slug` varchar(100) NOT NULL,
  `user_id` bigint(20) unsigned NOT NULL,
  `screen_name` varchar(20) NOT NULL,
  `name` varchar(40) DEFAULT NULL,
  `profile_image_url` varchar(200) DEFAULT NULL,
  `location` varchar(30) DEFAULT NULL,
  `url` varchar(200) DEFAULT NULL,
  `description` varchar(200) DEFAULT NULL,
  `created_at` datetime NOT NULL,
  `followers_count` int(10) unsigned DEFAULT NULL,
  `friends_count` int(10) unsigned DEFAULT NULL,
  `statuses_count` int(10) unsigned DEFAULT NULL,
  `protected` tinyint(1) NOT NULL,
  PRIMARY KEY (`owner_screen_name`,`slug`,`user_id`),
  KEY `screen_name` (`screen_name`),
  KEY `owner_screen_name` (`owner_screen_name`),
  KEY `slug` (`slug`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;
*/

// Arguments for list_members()
// $owner_screen_name is a string with account screen name
// $slug is a string with list name, spaces are replaced with - 
// For example: https://twitter.com/twitter/lists/official-twitter-accts
// $owner_screen_name = twitter
// $slug = official-twitter-accts

// $table_name is a string with name of DB table for users
// $clear_table is an optional value; if set to 1, this list's old rows are deleted first
function list_members($owner_screen_name, $slug, $table_name, $clear_table=0) {
	
  if (empty($owner_screen_name) || empty($slug) || empty($table_name)) {
    print "ERROR: Invalid arguments";
    exit;
  }
		
  // Connect to the database
  require('db_lib.php');
  $oDB = new db;
	
  if ($clear_table) {
    // Clear the table of old entries for this list
    $oDB->select("DELETE FROM $table_name
      WHERE owner_screen_name = '" . $oDB->escape($owner_screen_name) . "' " .
      "AND slug = '" . $oDB->escape($slug) . "'");
  }

  // Connect to API with OAuth
  require('oauth_lib.php');
  $connection = get_connection();	
	
  // Loop through pages of members, each page has 20 members
  // This is rate limited to 480 calls per 15 minute window
  // The total is 9,600 members, more than the 5,000 member limit
	
  // Start cursor at -1, end when cursor becomes 0
  $cursor = -1;	
  while ($cursor<>0) { 
    $connection->request('GET', $connection->url('1.1/lists/members'), 
      array('owner_screen_name' => $owner_screen_name,
      'slug' => $slug,
      'include_entities' => false,  // We don't need user's entities
      'skip_status' => true, // We don't need user's last tweet
      'cursor' => $cursor));
				
    // Exit on API error
    if ($connection->response['code'] <> 200) {
      print "ERROR: " . $connection->response['code'] . "\n";
      print $connection->response['response'];
      exit;
    }
	
    // Escape string values that may contain quotes
    // (use separate variables so the raw values are still sent to the API
    // when the next page is requested)
    $owner_esc = $oDB->escape($owner_screen_name);
    $slug_esc = $oDB->escape($slug);
		
    $results = json_decode($connection->response['response']);
    $users = $results->users;
    foreach($users as $user) {
			
      $screen_name = $oDB->escape($user->screen_name);
			
      // Prevent duplicates
      $where = "owner_screen_name = '$owner_screen_name' 
        AND slug = '$slug' AND screen_name = '$screen_name'";
      if (!$oDB->in_table($table_name,$where)) {
				
        // If user is not protected
        if (empty($user->protected)) {
          $protected = 0;
        } else {
          $protected = 1;
        }				
				
        $field_values = "owner_screen_name = '$owner_screen_name', " .
          "slug = '$slug', " .
          "user_id = " . $user->id_str . ", " .
          "screen_name = '$screen_name', name = '" . $oDB->escape($user->name) . "', " .
          "profile_image_url = '" . $user->profile_image_url . "', " .
          "location = '" . $user->location . "', " .
          "url = '" . $user->url . "', " .
          "description = '" . $oDB->escape($user->description) . "', " .
          "created_at = '" . $oDB->date($user->created_at) . "', " .
          "followers_count = " . $user->followers_count . ", " .
          "friends_count = " . $user->friends_count . ", " .
          "statuses_count = " . $user->statuses_count . ', ' . 
          "protected = $protected";
					
         $oDB->insert($table_name, $field_values);
       }
    }

    // Get the cursor for the next page of results
    $cursor = $results->next_cursor;
  }
}

?&gt;</pre>
<p>You can test this tool with list_members_test.php, which is set to read the members of this list: <a href="https://twitter.com/twitter/lists/twitter-engineering">https://twitter.com/twitter/lists/twitter-engineering</a>.</p>
<p><strong>list_members_test.php</strong></p>
<pre>&lt;?php

require('list_members.php');
list_members('twitter','twitter-engineering','list_members',1);

?&gt;</pre>
<p>One other point. This will read public lists from any account, and can also read a private list owned by the account whose OAuth tokens are used for making the API request. </p>
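<p>Once a list has been collected, other scripts can treat the list_members table as their input. As a small sketch, here is how another script might check whether a particular account (the screen name below is just an example) belongs to the list, using the same in_table() helper the tool itself uses:</p>
<pre>&lt;?php
// Illustration only: use the stored list as input for another script
require('db_lib.php');
$oDB = new db;

$screen_name = $oDB->escape('jack'); // hypothetical account to check

$where = "owner_screen_name = 'twitter' " .
  "AND slug = 'twitter-engineering' " .
  "AND screen_name = '$screen_name'";

if ($oDB->in_table('list_members', $where)) {
  print "$screen_name is on the list\n";
} else {
  print "$screen_name is not on the list\n";
}

?&gt;</pre>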
]]></content:encoded>
			<wfw:commentRss>http://140dev.com/twitter-api-programming-blog/twitter-api-tools-get-list-member-user-profiles/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Streaming API: Multiple server collection architecture</title>
		<link>http://140dev.com/twitter-api-programming-blog/streaming-api-multiple-server-collection-architecture/</link>
		<comments>http://140dev.com/twitter-api-programming-blog/streaming-api-multiple-server-collection-architecture/#comments</comments>
		<pubDate>Wed, 12 Feb 2014 21:33:58 +0000</pubDate>
		<dc:creator>Adam Green</dc:creator>
				<category><![CDATA[Server configuration]]></category>
		<category><![CDATA[Streaming API]]></category>

		<guid isPermaLink="false">http://140dev.com/?p=2912</guid>
		<description><![CDATA[Now that I&#8217;ve upgraded the streaming API framework to make it easier to manage keyword tweet collection, the next step is handling the increased data flow that results from more keywords. One simple solution is to upgrade your server. MySQL loves as much RAM as it can be given, and switching to a solid state [&#8230;]]]></description>
				<content:encoded><![CDATA[<p></p><p>Now that I&#8217;ve <a href="http://140dev.com/twitter-api-programming-blog/streaming-api-keyword-collection-enhancements-part-1/">upgraded</a> the <a href="http://140dev.com/free-twitter-api-source-code-library/">streaming API framework</a> to make it easier to manage keyword tweet collection, the next step is handling the increased data flow that results from more keywords. One simple solution is to upgrade your server. MySQL loves as much RAM as it can be given, and switching to a solid state drive is another fix that I highly recommend. But building one monstrous server may not be the most cost effective solution, especially if you are operating &#8220;in the cloud&#8221;. Cloud servers get really expensive when you try to load up lots of RAM. </p>
<p>An alternative solution that should be considered is to distribute your tweet collection across more than one server, each of which may not be that powerful. The result is often more bang for the buck. I&#8217;m going to cover some possible multiple server architectures that I&#8217;ve built for various projects over the past few years. </p>
<p>One solution is to dedicate one server to tweet collection, and another to data mining and data processing. I tend to call the first one the collection server, and the second the db server. In terms of my streaming API code, I would put a database with just the json_cache table on the collection server. The only code running on this machine would be get_tweets.php, which writes new tweets to its copy of json_cache. The db server would have the complete database schema, including its own copy of json_cache. It would run parse_tweets.php and any other database code you need, such as queries for a web interface to display the tweets.</p>
<p>The goal is to only give the db server as many new tweets as it can handle while maintaining good parsing and query performance. This can be done by a script that copies new tweets from json_cache on the collection server to json_cache on the db server, then deletes these tweets from the collection server. The db server would parse the new tweets it finds in its copy of json_cache, just the way it normally does. The nice thing is that other than the code to transfer tweets between servers, none of the other code changes. </p>
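<p>Here is a bare-bones sketch of that transfer script, run on a schedule from the db server. Everything in it is a placeholder: the host names and credentials, the batch size, and the assumption that json_cache has cache_id and raw_tweet columns all need to match your own servers and schema, and a production version would add more error checking so a failed insert never loses a tweet.</p>
<pre>&lt;?php
// Illustration only: move a batch of new tweets from json_cache on the
// collection server to json_cache on the db server, then delete them
// from the collection server so it keeps acting as a buffer
$collect = new mysqli('collection.internal', 'user', 'password', 'twitter_db');
$db = new mysqli('db.internal', 'user', 'password', 'twitter_db');

$batch_size = 1000; // tune to what the db server can parse comfortably

$result = $collect->query("SELECT cache_id, raw_tweet FROM json_cache
  ORDER BY cache_id LIMIT $batch_size");

while ($row = $result->fetch_assoc()) {
  $raw = $db->real_escape_string($row['raw_tweet']);
  if ($db->query("INSERT INTO json_cache (raw_tweet) VALUES ('$raw')")) {
    $collect->query("DELETE FROM json_cache WHERE cache_id = " . $row['cache_id']);
  }
}

?&gt;</pre>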
<p>In effect the collection server is now a buffer, holding new tweets as they arrive from the streaming API and protecting the db server from being crushed by too high a flow, or a sudden burst. The tweet transfer rate from collection server to db server can be managed by a timetable that transfers more tweets at night when the db server is unlikely to be running user requests. During the day the number of tweets stored on the collection server would rise if the flow was too fast to parse. Then at night the higher transfer rate would draw down the buffer. </p>
<p>For maximum performance and minimum cost, you have to make sure the two servers can communicate through the webhost&#8217;s internal network. You don&#8217;t want to pay for bandwidth costs to move this data across the public internet, which would also be a lot slower. </p>
<p>The benefit of this model is that as long as you only transfer new tweets to the db server at a rate it can handle, you are guaranteed an acceptable level of performance. A sudden trending topic or other increase in flow would impact the collection server, but have no effect on the db server. You don&#8217;t have to build up the db server&#8217;s hardware to handle the largest possible burst. That can save money, even with the addition of the collection server. The collection server can be kept small, since all it does is grab tweets from the API and insert them into json_cache. </p>
<p>The obvious downside of this architecture is that there would be a lag between the time tweets arrive from the API and when they are available for queries on the db server. This is fine for an application that does long-term analysis, but may not be acceptable for a site that needs to display new tweets in real time. </p>
<p>I&#8217;ll cover other possible server architectures in future posts that can fit different application requirements. </p>
]]></content:encoded>
			<wfw:commentRss>http://140dev.com/twitter-api-programming-blog/streaming-api-multiple-server-collection-architecture/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Twitter API Tools: Handling protected accounts</title>
		<link>http://140dev.com/twitter-api-programming-blog/twitter-api-tools-handling-protected-accounts/</link>
		<comments>http://140dev.com/twitter-api-programming-blog/twitter-api-tools-handling-protected-accounts/#comments</comments>
		<pubDate>Wed, 12 Feb 2014 14:58:02 +0000</pubDate>
		<dc:creator>Adam Green</dc:creator>
				<category><![CDATA[140dev Source Code]]></category>
		<category><![CDATA[Twitter API Tools]]></category>

		<guid isPermaLink="false">http://140dev.com/?p=2909</guid>
		<description><![CDATA[The previous post touched on some issues of protected accounts that should be pursued in more detail. Twitter&#8217;s rules for developers have two basic principles that apply here: Don&#8217;t surprise the user, and Respect user privacy. Both certainly apply to revealing data from a protected account. What becomes clear if you experiment with the code [&#8230;]]]></description>
				<content:encoded><![CDATA[<p></p><p>The <a href="http://140dev.com/twitter-api-programming-blog/twitter-api-tools-get-users-last-tweet/">previous post</a> touched on some issues of protected accounts that should be pursued in more detail. Twitter&#8217;s <a href="https://dev.twitter.com/terms/api-terms">rules for developers</a> have two basic principles that apply here: Don&#8217;t surprise the user, and Respect user privacy. Both certainly apply to revealing data from a protected account. </p>
<p>What becomes clear if you experiment with the code in the <a href="http://140dev.com/twitter-api-programming-blog/twitter-api-tools-get-users-last-tweet/">user_tweet.php</a> tool is that it is possible to see the last tweet of a protected account. This happens when you request a protected account with OAuth tokens from that same account. This can occur in two ways. You can be using single-user OAuth and then call user_tweet() with that user&#8217;s user_id or screen_name. Or you can have a multi-user login system, and call user_tweet() with the tokens for the same user you are asking about. </p>
<p>The important point is that just because you get back a <strong>$response->status</strong> element in the API response, you can&#8217;t assume it is OK to display it or store it in a database. You must always check the <strong>$response->protected</strong> element first. If that is not empty, or has a value of true, then the account is protected, and you should ignore any tweet data delivered by the API. </p>
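<p>A minimal sketch of that check, assuming $response is the decoded user object returned by the API (as in the user_tweet.php tool):</p>
<pre>&lt;?php
// Illustration only: never touch the tweet before checking protected
if (!empty($response->protected)) {
  // Protected account: ignore any tweet data the API included
  print $response->screen_name . " is protected, skipping the tweet\n";
} elseif (isset($response->status)) {
  // Public account: safe to display or store the last tweet
  print $response->status->text . "\n";
}
?&gt;</pre>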
<p>I understand why Twitter coded the response for protected accounts the way they did. Their assumption is that when you authorize with a user&#8217;s tokens, it is OK to give you all their data. It is your responsibility to not accidentally reveal this data. The safe rule is: <strong>If the account is protected, don&#8217;t look at the tweets.</strong> </p>
]]></content:encoded>
			<wfw:commentRss>http://140dev.com/twitter-api-programming-blog/twitter-api-tools-handling-protected-accounts/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>
