4elements, Amsterdam, Holland

  1. Building With the Twitter API: OAuth, Reading and Posting

    Final product image
    What You'll Be Creating

    This post is the second of a three-part series on using the Twitter API. If you missed part one, you can read it here.

    Authenticating With Twitter via OAuth

    Birdcage uses a Yii extension called Yii Twitter by Will Wharton, which makes use of the open-source PHP OAuth Twitter library by Abraham Williams.

    I place the extension in the Yii tree under /app/protected/extensions/yiitwitteroauth. In Yii, you configure extension properties in the main.php configuration file like so:

        // application components
        'components' => array(
          'twitter' => array(
            'class' => 'ext.yiitwitteroauth.YiiTwitter',
            'consumer_key' => '',
            'consumer_secret' => '',
            'callback' => '',
          ),
        ),

    Normally, I'd load these settings from my Yii .ini file, but to make the Birdcage setup simpler, I'm configuring the application keys from the UserSettings model. I've extended YiiTwitter.php to load the default user's application keys during initialization:

        public function init() {
          // load twitter app keys from UserSetting table
          $result = UserSetting::model()->loadPrimarySettings();
          $this->consumer_key = $result['twitter_key'];
          $this->consumer_secret = $result['twitter_secret'];
          $this->callback = $result['twitter_url'];
          $this->registerScripts();
          parent::init();
        }

    Once you've installed and configured the application settings, you'll need to visit the Accounts menu and click Add Your Twitter Account.

    Manage Accounts in the Birdcage application

    When you click on the Twitter icon, it executes the Connect method of the Birdcage Twitter controller:

      public function actionConnect()
      {
        unset(Yii::app()->session['account_id']);
        Yii::app()->session['account_id'] = $_GET['id'];
        $twitter = Yii::app()->twitter->getTwitter();
        $request_token = $twitter->getRequestToken();
        // set some session info
        Yii::app()->session['oauth_token'] = $token = $request_token['oauth_token'];
        Yii::app()->session['oauth_token_secret'] = $request_token['oauth_token_secret'];

        if ($twitter->http_code == 200) {
          // get the twitter authorization url and send the user there
          $url = $twitter->getAuthorizeURL($token);
          Yii::app()->request->redirect($url);
        } else {
          // error: return to the home page
          $this->redirect(Yii::app()->homeUrl);
        }
      }

    This will take you back to Twitter via OAuth to authenticate your Twitter account:

    Twitter OAuth Challenge Screen

    Once you've logged in, Twitter will ask you to authorize the Birdcage application:

    Authorize app for Twitter API

    Twitter will then return the browser to your callback URL, which is handled by our Twitter controller's Callback method. It saves your Twitter user's OAuth token and secret in the account table:

      public function actionCallback() {
        /* If the oauth_token is old, redirect to the connect page. */
        if (isset($_REQUEST['oauth_token']) && Yii::app()->session['oauth_token'] !== $_REQUEST['oauth_token']) {
          Yii::app()->session['oauth_status'] = 'oldtoken';
        }
        /* Create a TwitterOAuth object with the app key/secret and the request token key/secret */
        $twitter = Yii::app()->twitter->getTwitterTokened(Yii::app()->session['oauth_token'], Yii::app()->session['oauth_token_secret']);
        /* Exchange the request token for access tokens from Twitter */
        $access_token = $twitter->getAccessToken($_REQUEST['oauth_verifier']);
        /* Save the access tokens in the account table for future use */
        Yii::app()->session['access_token'] = $access_token;
        $account = Account::model()->findByAttributes(array('user_id' => Yii::app()->user->id, 'id' => Yii::app()->session['account_id']));
        $account['oauth_token'] = $access_token['oauth_token'];
        $account['oauth_token_secret'] = $access_token['oauth_token_secret'];
        $account->save();

        /* Remove the no-longer-needed request tokens */
        unset(Yii::app()->session['oauth_token']);
        unset(Yii::app()->session['oauth_token_secret']);

        if (200 == $twitter->http_code) {
          /* The user has been verified and the access tokens are saved for future use */
          Yii::app()->session['status'] = 'verified';
          $this->redirect(array('account/admin'));
        } else {
          /* Save the HTTP status for an error dialog on the connect page */
          $this->redirect(Yii::app()->homeUrl);
        }
      }
    

    Now, Birdcage is ready to begin making requests for Twitter data via the API on behalf of your user account.

    As you'll see ahead, a simple call with the tokens allows access to the API:

    $twitter = Yii::app()->twitter->getTwitterTokened($account['oauth_token'], $account['oauth_token_secret']);
    

    Processing Tweets in the Background

    For part two of our tutorial, we're using the Twitter REST API. Part three will delve into the real-time, always-on Streaming API:

    Using the Twitter REST API

    Retrieving Twitter Timelines

    Twitter timelines are an ever-expanding stack of tweets, so monitoring activity is a bit more complicated than with your average REST API. You can learn more about the unique problem Twitter timelines present here. Essentially, as you're trying to read the timeline history, more tweets are being added all the time:

    The Ever Expanding Twitter Timeline

    Twitter provides a relatively simple way to manage this. You can follow the code that performs this in Birdcage's Tweet model, getRecentTweets().

    First, we look up the highest (most recent) tweet_id in our database, and return an incremented value:

      public function getLastTweet($account_id) {
        // get highest tweet_id where account_id = $account_id
        $criteria = new CDbCriteria;
        $criteria->select = 'max(tweet_id) AS max_tweet_id';
        $criteria->condition = "account_id = ".$account_id;
        $row = Tweet::model()->find($criteria);
        if ($row['max_tweet_id'] == 0)
          return 1;
        else
          return $row['max_tweet_id'] + 1;
      }

    Then, we request some number (e.g. 100) of tweets since the highest previously processed one. The Twitter API recognizes the since_id as a pointer to the place in the timeline you wish to start retrieving from. It will return all the tweets more recent than since_id. In the example below, we're querying the REST API statuses/home_timeline method. The home timeline is what a user sees on their main Twitter screen.

        $since_id = $this->getLastTweet($account->id);
        echo 'since: '.$since_id; lb(); // lb() is a Birdcage helper that emits a line break
        // retrieve up to 100 tweets newer than the last stored one
        $tweets = $twitter->get("statuses/home_timeline", array('count' => 100, 'since_id' => $since_id));
        if (count($tweets) == 0) return false; // nothing returned
    

    It's also important to check whether we've been rate limited by Twitter. Each application-user token is allowed 180 requests to a user's home timeline per 15-minute window, but rate limits vary by endpoint, so your programming mileage may vary.
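    The isRateLimited() helper used in the paging loop below isn't shown in the article's excerpts. Here's a hypothetical sketch of what such a check might look like (Birdcage's actual implementation may differ): when a limit is exceeded, the REST API returns an errors array instead of tweets, with error code 88 meaning "Rate limit exceeded".

```php
<?php
// Hypothetical sketch of a rate-limit check; Birdcage's real helper may differ.
// A rate-limited REST response is an object carrying an errors array, rather
// than the plain array of tweet objects a successful timeline call returns.
function isRateLimited($response) {
    if (!is_object($response) || !isset($response->errors)) {
        return false; // a successful timeline call returns an array of tweets
    }
    foreach ($response->errors as $error) {
        if ($error->code == 88) { // 88: "Rate limit exceeded"
            return true;
        }
    }
    return false;
}
```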

    For each tweet received, we call our Parse() method to process the data and store it in our various database tables. During the process, we track the oldest/lowest tweet_id that we've received from Twitter:

        foreach ($tweets as $i) {
          if ($low_id==0)
            $low_id = intval($i->id_str);
          else
            $low_id = min(intval($i->id_str),$low_id);
          Tweet::model()->parse($account->id,$i);
        }

    The parse() method, found in the Tweet model, adds the referenced Twitter user information and then the tweet itself:

      public function parse($account_id,$tweet) {
          // add user
          $tu = TwitterUser::model()->add($tweet->user);
          // add tweet
          $tweet_obj = $this->add($account_id,$tweet);

    Then, we continue to request blocks of tweets using the lowest ID from the last request as a max_id parameter which we send to Twitter. We make these subsequent requests using the since_id of the tweet we began with and the max_id from the last oldest tweet we retrieved.

        // retrieve the next block of tweets until our code's limit is reached
        while ($count_tweets <= $limit) {
          lb(2);
          $max_id = $low_id - 1;
          $tweets = $twitter->get("statuses/home_timeline", array('count' => 100, 'max_id' => $max_id, 'since_id' => $since_id));
          if (count($tweets) == 0) break;
          if ($this->isRateLimited($tweets)) return false;
          echo 'count'.count($tweets); lb();
          $count_tweets += count($tweets);
          foreach ($tweets as $i) {
            $low_id = min(intval($i->id_str), $low_id);
            Tweet::model()->parse($account->id, $i);
          }
        }

    Note that newer tweets arriving while we page backwards don't appear in these results, because each subsequent request is capped by max_id. We'll have to come back later to get the newer tweets, which are higher than our initial since_id.

    It's also important to note that we don't receive an unbounded number of older tweets: each request returns only the number of tweets we ask for, all of them older than the previous low ID (the max_id of our next call).

    Once you get used to the model and nomenclature, it's quite simple.
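    As a concrete illustration, the since_id/max_id scheme can be simulated with plain integers standing in for tweet IDs. This is a self-contained sketch; fetchPage() is a stand-in for the statuses/home_timeline call, and the function names are mine, not Birdcage's:

```php
<?php
// Simulate Twitter's timeline paging with integers in place of tweets.
// fetchPage() mimics statuses/home_timeline: newest first, bounded by
// since_id (exclusive) and max_id (inclusive), at most $count results.
function fetchPage(array $timeline, $sinceId, $maxId, $count) {
    rsort($timeline); // newest (highest id) first, as Twitter returns them
    $page = array();
    foreach ($timeline as $id) {
        if ($id > $sinceId && ($maxId === null || $id <= $maxId)) {
            $page[] = $id;
            if (count($page) == $count) break;
        }
    }
    return $page;
}

// Page backwards from the newest tweet until everything newer than
// $sinceId has been collected, exactly as described above.
function fetchAllSince(array $timeline, $sinceId, $count) {
    $all = array();
    $maxId = null; // first request is uncapped
    while (true) {
        $page = fetchPage($timeline, $sinceId, $maxId, $count);
        if (count($page) == 0) break;
        $all = array_merge($all, $page);
        $maxId = min($page) - 1; // lowest id received, minus one
    }
    return $all;
}
```

Here fetchAllSince($timeline, 4, 3) would page backwards in blocks of three until every ID above 4 has been seen.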

    While there is a Fetch menu command which will run this operation, we also configure a cron job to call our DaemonController method every five minutes:

    # To define the time you can provide concrete values for
    # minute (m), hour (h), day of month (dom), month (mon),
    # and day of week (dow) or use '*' in these fields (for 'any').
    #
    # Notice that tasks will be started based on the cron's system
    # daemon's notion of time and timezones.
    #
    # For example, you can run a backup of all your user accounts
    # at 5 a.m every week with:
    # 0 5 * * 1 tar -zcf /var/backups/home.tgz /home/
    #
    # m h  dom mon dow   command
    */5 * * * * wget -O /dev/null http://birdcage.yourdomain.com/daemon/index

    This in turn calls our getStreams() method, which performs the operations described above (the Stream::model()->process() path will be described in part three of this series):

    public function actionIndex() {
      // if not using twitter streams, we'll process tweets by REST API
      if (!Yii::app()->params['twitter_stream']) {
        Tweet::model()->getStreams();
      } else {
        Stream::model()->process();
      }
    }

    The end result looks something like this:

    Birdcage statuses home_timeline via Twitter API

    At one point, I ran into some Twitter API reliability problems. You can check the status of the Twitter API services here.

    Posting a Tweet

    Posting tweets to your Twitter account is actually quite straightforward. We just need to make use of the statuses/update REST method. It takes a bit more work to perform accurate character counts. 

    Twitter resolves all URLs into http://t.co shortcuts, so every URL counts as 20 characters. I needed JavaScript that would count characters, adjusting by 20 for each URL regardless of its actual length. I settled on a combination of jQuery and JavaScript solutions, which I'll detail below.
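    For illustration, the same fixed-cost rule can be expressed server-side. This is a simplified sketch, not Birdcage code: the naive URL regex and the tweetLength() name are my own assumptions, and the real counting in Birdcage happens client-side in twitter_count.js.

```php
<?php
// Sketch: count a tweet's length with every URL charged at a fixed t.co cost.
// Assumptions (mine, not Birdcage's): a naive URL regex and the article's
// 20-character t.co length.
define('TCO_LENGTH', 20);

function tweetLength($text) {
    // Strip each URL out and count how many we removed.
    $withoutUrls = preg_replace('#https?://\S+#i', '', $text, -1, $urlCount);
    // Character count of the remaining text, plus a flat cost per URL.
    $len = function_exists('mb_strlen') ? mb_strlen($withoutUrls) : strlen($withoutUrls);
    return $len + $urlCount * TCO_LENGTH;
}
```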

    I chose to create a model specifically for composing tweets called Status.php. This made it easier to work with Yii to generate forms for posting to the API. 

    When you click on Compose in the Birdcage menu, it will take you to the Compose method of the StatusController:

    public function actionCompose($id = 0)
    {
      if (!UserSetting::model()->checkConfiguration(Yii::app()->user->id)) {
        Yii::app()->user->setFlash('warning','Please configure your Twitter settings.');
        $this->redirect(array('/usersetting/update'));
      }

      $model = new Status;
      $model->account_id = $id;

      // Uncomment the following line if AJAX validation is needed
      // $this->performAjaxValidation($model);

      if (isset($_POST['Status']))
      {
        $model->attributes = $_POST['Status'];
        if ($model->account_id == '' or $model->account_id == 0) {
          Yii::app()->user->setFlash('no_account','You must select an account before tweeting.');
          $this->redirect(array('status/compose'));
        }
        $model->created_at = new CDbExpression('NOW()');
        $model->modified_at = new CDbExpression('NOW()');

        if ($model->save()) {
          $account = Account::model()->findByPK($model->account_id);
          $twitter = Yii::app()->twitter->getTwitterTokened($account['oauth_token'], $account['oauth_token_secret']);
          // post the new status to Twitter
          $tweets = $twitter->post("statuses/update", array('status' => $model->tweet_text));
          $this->redirect(array('view','id' => $model->id));
        }
      }

      $this->render('compose', array(
        'model' => $model,
      ));
    }

    This will load the HTML form for creating a Status item. Check out the _form.php in /app/protected/views/status/.

    First, I'll load several jQuery and JavaScript libraries for character counting:

    $baseUrl = Yii::app()->baseUrl; 
    $cs = Yii::app()->getClientScript();
    $cs->registerScriptFile($baseUrl.'/js/jquery.simplyCountable.js');
    $cs->registerScriptFile($baseUrl.'/js/twitter-text.js');
    $cs->registerScriptFile($baseUrl.'/js/twitter_count.js');
    

    I used a combination of the jQuery simplyCountable plugin, twitter-text.js (a JavaScript-based Twitter text-processing script) and a script that did the heavy lifting of URL adjustments: twitter_count.js.

    The following code then creates the remainder of the compose form and activates the character counting scripts:

    <?php $form=$this->beginWidget('bootstrap.widgets.TbActiveForm',array(
        'id'=>'status-form',
    	'enableAjaxValidation'=>false,
    )); ?>
    
    <?php
      if (Yii::app()->user->hasFlash('no_account')) {
        $this->widget('bootstrap.widgets.TbAlert', array(
          'alerts' => array( // configurations per alert type
            'no_account' => array('block' => true, 'fade' => true, 'closeText' => '×'),
          ),
        ));
      }
    ?>
    
    	<p class="help-block">Fields with <span class="required">*</span> are required.</p>
    
    	<?php echo $form->errorSummary($model); ?>
    
      <?php
        if ($model->account_id == 0) {
          echo CHtml::activeLabel($model,'account_id',array('label'=>'Tweet with Account:'));
          $model->account_id = 1;
          echo CHtml::activeDropDownList($model,'account_id',Account::model()->getList(),array('empty'=>'Select an Account'));
        } else {
          echo CHtml::hiddenField('account_id',$model->account_id);
        }
      ?>
    
      <br />
    	<?php 
    	echo $form->textAreaRow($model,'tweet_text',array('id'=>'tweet_text','rows'=>6, 'cols'=>50, 'class'=>'span8'));
       ?>
       <p class="right">Remaining: <span id="counter2">0</span></p>
    
    	<div class="form-actions">
    		<?php $this->widget('bootstrap.widgets.TbButton', array(
    			'buttonType'=>'submit',
    			'type'=>'primary',
    			'label'=>$model->isNewRecord ? 'Create' : 'Save',
    		)); ?>
    	</div>
    
    <?php $this->endWidget(); ?>
    <script type="text/javascript" charset="utf-8">
    	$(document).ready(function()
    	{
    	  $('#tweet_text').simplyCountable({
    	    counter: '#counter2',
          maxCount: 140,
          countDirection: 'down'
    	  });
    	});
    </script>
    

    The result looks like this:

    The Birdcage Compose a Tweet PHP Twitter API Example

    When the tweet is saved, it executes this code in the StatusController—which posts the resulting tweet_text to Twitter via OAuth:

        if ($model->save()) {
          $account = Account::model()->findByPK($model->account_id);
          $twitter = Yii::app()->twitter->getTwitterTokened($account['oauth_token'], $account['oauth_token_secret']);
          // post the new status to Twitter
          $tweets = $twitter->post("statuses/update", array('status' => $model->tweet_text));
          $this->redirect(array('view','id' => $model->id));
        }

    Next Steps

    In this part of the series, we've reviewed how to authenticate with the Twitter API via OAuth, how to query for ranges of tweets in the user's timeline, and how to count characters and post tweets via the API. I hope you've found this useful.

    Part three will cover using the Twitter Streaming API and the open source Phirehose streaming implementation.

    Please post any comments, corrections, or additional ideas below. You can browse my other Tuts+ tutorials on my author page or follow me on Twitter @reifman.

     


    1. Building With the Twitter API: Getting Started

      Final product image
      What You'll Be Creating

      Getting started with the Twitter API may seem a bit complicated, but it’s actually quite straightforward. There are a number of libraries for common languages that make the process quite easy. Twitter’s API documentation has also improved greatly over time.

      In March 2013, Twitter began requiring OAuth for all API sessions, and that’s led to improvements for users and developers all round. OAuth’s secure authenticated platform has helped Twitter protect user privacy while improving tracking; this in turn has allowed the company to increase the limits for developers on individual API calls.

      This series consists of three parts. Part one will cover:

      • an introduction to the Twitter API
      • building a database schema for Twitter
      • building out a PHP application in the Yii Framework for working with Twitter

      Birdcage, our basic Yii Twitter application used in this tutorial, is available to you via open source. If you’d like to learn more about Yii, check out Introduction to the Yii Framework.

      Part two of this series will cover: 

      • authentication with Twitter via OAuth
      • processing incoming tweets in the background using the REST API
      • posting tweets

      Part three will cover use of the real time Twitter Streaming API and the open source Phirehose streaming implementation. While part two processes tweets using the REST API, part three will describe how to build an always-on connection with the Twitter data stream. This may be a new topic for many PHP developers.

      Introduction to the Twitter API

      For the most part, this series of tutorials will focus on three parts of the Twitter platform: 

      1. OAuth authentication
      2. the REST API
      3. the Streaming API

      You can read the Twitter API documentation here.

      OAuth Authentication

      As of version 1.1, the Twitter API now requires OAuth authentication, either application-only authentication or application-user authentication. The latter requires your Twitter user to click through to the Twitter website, sign in with their credentials, and then return to your site. Application-user authentication is required for many user-specific API calls. 

      In other words, when you begin to access the Twitter API on behalf of a user, your user will be directed to Twitter to authorize your application. Twitter will return tokens which do not expire until the user revokes them. You’ll use these tokens to authenticate your calls on behalf of this user.

      The REST API

      The most common way to access Twitter data is through the REST API. Using the secure tokens obtained via OAuth, your application makes requests to Twitter for specific data, e.g. the user's home timeline or their own statuses, or a request to post a tweet for a specific user.

      Using the REST API with Twitter

      The REST API should meet the needs of most Twitter application programmers. 

      The Streaming API

      The Twitter Streaming API allows you to receive tweets and notifications in real time from Twitter. However, it requires a high-performance, persistent, always-on connection between your server and Twitter. 

      Fortunately, there is a great open-source library called Phirehose by Fenn Bailey which implements most of the Twitter streaming API requirements. We'll review how to set up Phirehose and adapt it to your application in part three of this tutorial.

      There are three variations of the Twitter Streaming API:

      1. The Public Stream. This allows your application to monitor public data on Twitter, such as public tweets, hashtag filters, etc. 
      2. The User Stream. This allows you to track a user's tweet stream in real time. Part three of this tutorial will focus on the user stream. 
      3. Site Streams (require prior approval from Twitter). Site streams allow your application to monitor real-time Twitter feeds for a large number of users. 

      The job of your streaming implementation is to log the incoming events as quickly as possible and process them in the background using the REST API as necessary to harvest deeper data. Twitter sometimes calls this gathering of deeper data about events "hydrating".

      Use of the REST API is subject to various rate limits by Twitter. It’s important to be a responsible user of Twitter’s API by planning limits to your activity within your application and monitoring the API rate limit responses. The Streaming API doesn't have rate limits since data is pushed to your server as it comes in.
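      As one way to plan limits to your activity, you can track your own request budget per 15-minute window on the client side and decline calls once the documented allowance is spent. This sketch is illustrative only; the class and method names are mine, not part of Birdcage:

```php
<?php
// Illustrative client-side request budget (names and approach are mine).
// Tracks call timestamps and refuses a call once the per-window
// allowance (e.g. 180 home_timeline requests per 900 seconds) is spent.
class RateBudget {
    private $limit;
    private $windowSeconds;
    private $calls = array(); // timestamps of calls made in the current window

    public function __construct($limit = 180, $windowSeconds = 900) {
        $this->limit = $limit;
        $this->windowSeconds = $windowSeconds;
    }

    // $now is a unix timestamp; returns true if the call may proceed.
    public function tryAcquire($now) {
        // Drop timestamps that have aged out of the window.
        $cutoff = $now - $this->windowSeconds;
        $this->calls = array_filter($this->calls, function ($t) use ($cutoff) {
            return $t > $cutoff;
        });
        if (count($this->calls) >= $this->limit) {
            return false; // would exceed the limit; wait for the window to roll
        }
        $this->calls[] = $now;
        return true;
    }
}
```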

      Building a Database Schema for the Twitter API

      While Twitter seems simple from a distance, it’s actually a very deep and complex data stream, including the ever-growing timeline, relationships between users, mentions, notifications, favorites, lists, geolocation, multimedia, places, and more.

      As a developer you have to decide which of this data is most important to your application to store in your own database. A minimalist approach may serve you well. The Twitter API is flexible enough that you can always go back and expand (or hydrate) the data related to the events you store.

      Birdcage is a free, open-source, Yii-based application which implements the Twitter API framework for the purposes of this tutorial. If you're not familiar with Yii, please read my Introduction to the Yii Framework. Even if you're unfamiliar with Yii, the individual PHP code segments in this tutorial should be quite easy to follow. 

      If you wish to see a Twitter API implementation in basic PHP, check out Adam Greene's 140Dev. He's done a great job providing a basic platform for Twitter API access. His book Twitter API Engagement Programming has a creative take on using the Twitter API to organically build your influence on Twitter. 

      One of the benefits of Birdcage and the Yii Framework code is that we can use Yii's scaffolding component, Gii, to generate a web user interface for the basic application in minutes—something that you just can't do in basic PHP.

      As a Yii application, Birdcage uses ActiveRecord database migrations to build its database. Database migrations make it possible to programmatically build and extend our schema. This is especially useful if you implement a minimalist consumption of the Twitter API and later choose to expand what you gather.

      I'll walk you through several examples of constructing the database schema in Yii and the power of its web-based scaffolding constructor, Gii.

      If you'd like to try out the Birdcage code on your own, please visit my site for a complete walk-through of installation instructions.

      First, we’ll create an account table to store the OAuth tokens and secrets from Twitter for each account we connect. These accounts are linked to the registered user in the user table by our internal user_id.

      From the command line, we'll tell Yii to create a new table migration for Twitter accounts: ./app/protected/yiic migrate create create_account_table.

      We'll complete the migration manually like this:

      <?php

      class m140911_212834_create_account_table extends CDbMigration
      {
        protected $MySqlOptions = 'ENGINE=InnoDB CHARSET=utf8 COLLATE=utf8_unicode_ci';
        public $tablePrefix;
        public $tableName;

        public function before() {
          $this->tablePrefix = Yii::app()->getDb()->tablePrefix;
          if ($this->tablePrefix != '')
            $this->tableName = $this->tablePrefix.'account';
        }

        public function safeUp()
        {
          $this->before();
          $this->createTable($this->tableName, array(
            'id' => 'pk',
            'user_id' => 'integer default 0',
            'screen_name' => 'string NOT NULL',
            'oauth_token' => 'string NOT NULL',
            'oauth_token_secret' => 'string NOT NULL',
            'last_checked' => 'TIMESTAMP DEFAULT 0',
            'created_at' => 'DATETIME NOT NULL DEFAULT 0',
            'modified_at' => 'TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP',
          ), $this->MySqlOptions);
          $this->addForeignKey('fk_account_user', $this->tableName, 'user_id', $this->tablePrefix.'users', 'id', 'CASCADE', 'CASCADE');
        }

        public function safeDown()
        {
          $this->before();
          $this->dropForeignKey('fk_account_user', $this->tableName);
          $this->dropTable($this->tableName);
        }
      }

      To have Yii run the migration which will construct the SQL table, we do this: ./app/protected/yiic migrate up

      You'll see something like this:

      Example of Yii Migration Tool

      We can use Gii, Yii's web-based scaffolding generator, to build our model view controllers for the database.

      Gii - the Yii Code Generator

      In my development environment, I point my web browser to localhost:8888/twitter/app/gii, type in my Gii password (stored in my twitter.ini file), and choose Model Generator:

      Using Gii to Build Our Model Code

      It only takes a second and should show this success message:

      Gii Model Code Successfully Generated

      The model code Gii generates can be used to build a variety of methods related to the Account table. But Gii can also generate the beginnings of the web user interface for managing Twitter accounts.

      Click on Bootstrap Generator, specify the Account model, and Gii will build out the scaffolding for your Account web user interface: 

      Using the Gii Bootstrap Generator for Web Scaffolding

      The resulting code creates a default model view controller user interface which looks something like this:

      Gii Default Controller

      Using Yii Active Record Migrations and Gii is an extremely powerful timesaver for building out a basic web user interface. Once the default scaffolding code is in place, it's straightforward to customize and extend it.

      Next, we’ll build the database tables for storing Twitter data, including our Twitter_User and Tweet tables. Here's the Twitter_User table:

        $this->createTable($this->tableName, array(
          'id' => 'pk',
          'twitter_user_id' => 'bigint(20) unsigned NOT NULL',
          'screen_name' => 'string NOT NULL',
          'name' => 'string DEFAULT NULL',
          'profile_image_url' => 'string DEFAULT NULL',
          'location' => 'string DEFAULT NULL',
          'url' => 'string DEFAULT NULL',
          'description' => 'string DEFAULT NULL',
          'followers_count' => 'int(10) unsigned DEFAULT NULL',
          'friends_count' => 'int(10) unsigned DEFAULT NULL',
          'statuses_count' => 'int(10) unsigned DEFAULT NULL',
          'time_zone' => 'string DEFAULT NULL',
          'created_at' => 'DATETIME NOT NULL DEFAULT 0',
          'modified_at' => 'TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP',
        ), $this->MySqlOptions);
        $this->createIndex('twitter_user_id', $this->tableName, 'twitter_user_id', true);

      Here's the Tweet table:

        $this->createTable($this->tableName, array(
          'id' => 'pk',
          'account_id' => 'integer default 0',
          'twitter_user_id' => 'bigint(20) unsigned NOT NULL',
          'last_checked' => 'TIMESTAMP DEFAULT 0',
          'tweet_id' => 'BIGINT(20) unsigned NOT NULL',
          'tweet_text' => 'TEXT NOT NULL',
          'is_rt' => 'TINYINT DEFAULT 0',
          'created_at' => 'DATETIME NOT NULL DEFAULT 0',
          'modified_at' => 'TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP',
        ), $this->MySqlOptions);
        $this->createIndex('tweet_id', $this->tableName, 'tweet_id', true);
        $this->addForeignKey('fk_tweet_account', $this->tableName, 'account_id', $this->tablePrefix.'account', 'id', 'CASCADE', 'CASCADE');
        $this->addForeignKey('fk_tweet_user_id', $this->tableName, 'twitter_user_id', $this->tablePrefix.'twitter_user', 'twitter_user_id', 'CASCADE', 'CASCADE');
      

      Notice that we use a foreign key relation to the TwitterUser table. Gii smartly builds relations for us in the Tweet model:

          public function relations()
      	{
      		// NOTE: you may need to adjust the relation name and the related
      		// class name for the relations automatically generated below.
      		return array(
      			'hashtags' => array(self::HAS_MANY, 'Hashtag', 'tweet_id'),
      			'mentions' => array(self::HAS_MANY, 'Mention', 'tweet_id'),
      			'account' => array(self::BELONGS_TO, 'Account', 'account_id'),
      			'twitterUser' => array(self::BELONGS_TO, 'TwitterUser', 'twitter_user_id'),
      			'urls' => array(self::HAS_MANY, 'Url', 'tweet_id'),		  
      		);
      	}
      

      Yii Active Record then automatically manages joins for us. Thus, you can reference TwitterUser properties in your queries with code such as echo $tweet->twitterUser->profile_image_url.

      In general, Birdcage is intended as a simple framework that you might expand on your own. I did not make great attempts here to minimize storage space based on Twitter size definitions or to optimize relations within the schema. I primarily designed this for personal use.

      Twitter pre-parses tweets into what it calls entities, which help filter out metadata for you. These are divided into Mentions, URLs, and Hashtags.
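      As a sketch of what consuming those entities looks like, a decoded tweet exposes entities->hashtags, entities->urls, and entities->user_mentions, following the REST API's JSON structure. The extractEntities() helper below is hypothetical, not Birdcage code:

```php
<?php
// Hypothetical helper: pull the pre-parsed entities out of a decoded tweet
// object (as returned by json_decode of the REST API's tweet JSON).
function extractEntities($tweet) {
    $result = array('hashtags' => array(), 'urls' => array(), 'mentions' => array());
    if (!isset($tweet->entities)) return $result;
    foreach ($tweet->entities->hashtags as $tag) {
        $result['hashtags'][] = $tag->text;       // e.g. "php" for #php
    }
    foreach ($tweet->entities->urls as $url) {
        $result['urls'][] = $url->expanded_url;   // the original, unshortened URL
    }
    foreach ($tweet->entities->user_mentions as $mention) {
        $result['mentions'][] = $mention->screen_name;
    }
    return $result;
}
```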

      Mentions

      When tweets mention other users, e.g. @tommcfarlin, Twitter provides metadata describing those mentions. Here's the schema we'll use to store them:

        $this->createTable($this->tableName, array(
          'id' => 'pk',
          'tweet_id' => 'BIGINT(20) unsigned NOT NULL',
          'source_user_id' => 'bigint(20) unsigned NOT NULL',
          'target_user_id' => 'bigint(20) unsigned NOT NULL',
        ), $this->MySqlOptions);
        $this->addForeignKey('fk_mention_tweet', $this->tableName, 'tweet_id', $this->tablePrefix.'tweet', 'tweet_id', 'CASCADE', 'CASCADE');
      

      URLs

      Whenever tweets include links, Twitter provides metadata listing them. Here's the schema we'll use for storing URLs included in the tweets:

          $this->createTable($this->tableName, array(
                   'id' => 'pk',
                   'tweet_id' => 'BIGINT(20) unsigned NOT NULL',
                   'url'=>'string NOT NULL',
                     ), $this->MySqlOptions);
          $this->addForeignKey('fk_url_tweet', $this->tableName, 'tweet_id', $this->tablePrefix.'tweet', 'tweet_id', 'CASCADE', 'CASCADE');
      

      Hashtags

      Whenever tweets include hashtags, e.g. #php, Twitter provides metadata describing them. Here's the schema we'll use for storing hashtags included in the tweets:

          $this->createTable($this->tableName, array(
                   'id' => 'pk',
                   'tweet_id' => 'BIGINT(20) unsigned NOT NULL',
                   'tag'=>'string NOT NULL',
                     ), $this->MySqlOptions);
                    $this->addForeignKey('fk_hashtag_tweet', $this->tableName, 'tweet_id', $this->tablePrefix.'tweet', 'tweet_id', 'CASCADE', 'CASCADE');
      

      These migrations build the primary tables with which we'll store data from the Twitter API.

      In the Birdcage code, you'll see that there are a variety of other migrations as well; most of these support the broader application.

      Building the Birdcage Application

You'll need to register an application with Twitter to obtain your first OAuth application keys. Visit the Twitter Developer site and click Manage Your Apps. Click Create New App; I call mine the Twitter Framework for Yii:

      Create an Application for the Twitter API

      The callback URL for using Birdcage should be http://yourdomain.com/twitter/callback. This is the address that Twitter will return OAuth requests to. It's also the address in Birdcage of our Twitter controller for API calls.

      During the creation process, you'll need to configure the app permissions (use read and write for Birdcage) and make notes of the application OAuth key and secret:

      Twitter API Application Permissions

Here's what the API keys page looks like. The top API key and secret will be needed to set up Birdcage initially:

      Twitter API Application Keys

      Once you've added the application, you'll see it in the Twitter App menu:

      Manage Your Twitter Apps

      Once you install the code, you'll need to initialize the MySQL database by running the migrations. I do this in two steps.

      First, I run the Yii-User migration. This is an extension for Yii which manages most of my user login and registration needs:

      ./app/protected/yiic migrate --migrationPath=application.modules.user.migrations

      The migration will ask you to create credentials for your primary user account for the web application:

      Admin login [admin]:
      Admin email [webmaster@example.com]:
      Admin password [admin]:

      This is the account you'll use to log into the Birdcage web application, not your Twitter account credentials.

Then, I run the rest of the migrations: ./app/protected/yiic migrate up

      Once you configure Birdcage, visit your site in your web browser, e.g. http://birdcage.yourdomain.com.

      The BirdCage Twitter API Application Home Page

      Once you log in, it will ask you to enter your application OAuth keys and address for the Twitter controller, e.g. http://yourdomain.com/twitter (just the controller, not the callback) shown in the above images:

      The BirdCage Settings Screen

      Now, we're about ready to make interesting things happen. 

      Conclusion

That wraps up part two of this tutorial, in which we covered: 

      1. OAuth user authentication
      2. processing tweets in the background
      3. posting to Twitter

      Then, in part three, we will look at using the Twitter Streaming API and the open source Phirehose streaming implementation. While part two processes tweets using the REST API, part three will describe how to build an always on connection with the Twitter data stream.

      I hope you've found this useful. Please post any comments, corrections or additional ideas below. You can browse my other Tuts+ tutorials on my author page or follow me on Twitter @reifman.

       


    1. Introduction to Generators & Koa.js: Part 2

      Final product image
      What You'll Be Creating

Welcome to the second part of our series on generators and Koa. If you missed it, you can read part 1 here. Before starting with the development process, make sure that you have installed Node.js 0.11.9 or higher.

      In this part, we will be creating a dictionary API using Koa.js, and you'll learn about routing, compressing, logging, rate-limiting, and error handling in Koa.js. We will also use Mongo as our datastore and learn briefly about importing data into Mongo and the ease that comes with querying in Koa. Finally, we'll look into debugging Koa apps.

      Understanding Koa

      Koa has radical changes built under its hood which leverage the generator goodness of ES6. Apart from the change in the control flow, Koa introduces its own custom objects, such as this, this.request, and this.response, which conveniently act as a syntactic-sugar layer built on top of Node's req and res objects, giving you access to various convenience methods and getters/setters. 

      Apart from convenience, Koa also cleans up the middleware which, in Express, relied on ugly hacks which often modified core objects. It also provides better stream handling.
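As a rough illustration of that sugar (this is a sketch, not Koa's actual implementation), a context object can delegate to the underlying request and response objects via getters and setters:

```javascript
// Sketch: a Koa-style context that delegates to plain req/res objects.
// The field names (url, statusCode) mirror Node's, but this is a toy.
function makeContext(req, res) {
  return {
    request: req,
    response: res,
    get path() { return req.url.split('?')[0]; }, // read-through getter
    get status() { return res.statusCode; },
    set status(code) { res.statusCode = code; }   // write-through setter
  };
}

var req = { url: '/v1/all?word=new' };
var res = { statusCode: 404 };
var ctx = makeContext(req, res);
ctx.status = 200; // writes through to res.statusCode
```

Reading `ctx.path` here yields `/v1/all`, and assigning `ctx.status` updates the underlying response, which is the flavor of convenience Koa's real context provides.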

      Wait, What's a Middleware?

A middleware is a pluggable function that adds or removes a particular piece of functionality by doing some work on the request/response objects in Node.js.

      Koa's Middleware

A Koa middleware is essentially a generator function that receives the next downstream middleware as an argument and can yield to it. Usually, an application has a series of middleware that are run for each request. 

Also, a middleware must yield to the next 'downstream' middleware so that control can flow back 'upstream' to it once the downstream middleware finish. We will discuss this more in the error handling section.

      Building Middleware

Just one last thing: To add a middleware to your Koa application, we use the app.use() method and supply the middleware function as the argument. For example, app.use(logger()) (where logger comes from the koa-logger module) adds the logger to the list of middleware that our application uses.
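The downstream/upstream flow described above can be sketched without Koa at all. Below is a minimal, hypothetical dispatcher that cascades generator middleware the way Koa does (Koa's real dispatcher, co, additionally handles promises and errors):

```javascript
// Minimal sketch of a Koa-style cascade of generator middleware.
// Each middleware receives the next (downstream) middleware's generator
// as "next"; "yield next" pauses it until everything downstream is done.
function run(middleware, ctx) {
  function instantiate(i) {
    if (i === middleware.length) return (function* () {})(); // no-op tail
    return middleware[i].call(ctx, instantiate(i + 1));
  }
  function drive(gen) { // naive driver: recurse into yielded generators
    var step = gen.next();
    while (!step.done) {
      if (step.value && typeof step.value.next === 'function') drive(step.value);
      step = gen.next();
    }
  }
  drive(instantiate(0));
}

var ctx = { trace: [] };
run([
  function* (next) { this.trace.push('a:in'); yield next; this.trace.push('a:out'); },
  function* (next) { this.trace.push('b:in'); yield next; this.trace.push('b:out'); }
], ctx);
```

Running this records the order a:in, b:in, b:out, a:out: each middleware runs once on the way downstream and resumes on the way back upstream.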

      Building the Application

      To start with the dictionary API, we need a working set of definitions. To recreate this real-life scenario, we decided to go with a real dataset. We took the definition dump from Wikipedia and loaded it into Mongo. The set consisted of about 700,000 words as we imported only the English dump. Each record (or document) consists of a word, its type, and its meaning. You can read more about the importing process in the import.txt file in the repository.

      To move along the development process, clone the repository and check your progress by switching to different commits. To clone the repo, use the following command:

      $ git clone https://github.com/bhanuc/dictapi.git

      We can start by creating a base server Koa:

      var koa = require('koa');
      var app = koa();
      
      app.use(function *(next){
          this.type = 'json';
          this.status = 200;
          this.body = {'Welcome': 'This is a level 2 Hello World Application!!'};
      });
      
      if (!module.parent) app.listen(3000);
      console.log('Hello World is Running on http://localhost:3000/');
      
      

In the first line, we import Koa and save an instance in the app variable. Then we add a single middleware on line 4, which is an anonymous generator function that takes the next middleware as a parameter. Here, we set the type and status code of the response; Koa can determine these automatically, but we can also set them manually. Finally, we set the body of the response. 

      Since we have set the body in our first middleware, this will mark the end of each request cycle and no other middleware will be involved. Lastly, we start the server by calling its listen method and pass on the port number as a parameter.

      We can start the server by running the script via:

      $ npm install koa
      $ node --harmony index.js

      You can directly reach this stage by moving to commit 6858ae0:

      $ git checkout 6858ae0

      Adding Routing Capabilities

Routing allows us to direct different requests to different functions on the basis of request type and URL. For example, we might want to respond to /login differently from /signup. This could be done by adding a middleware that manually checks the URL of each request and runs the corresponding handler. Or, instead of writing that middleware by hand, we can use a community-made middleware, also known as a middleware module.
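Such a hand-rolled check might look like the sketch below. The routes and responses are made up for illustration; only the shape of a generator middleware inspecting its context is the point:

```javascript
// Hypothetical hand-rolled routing middleware: respond on a match,
// otherwise yield to the rest of the middleware chain.
function* routerMiddleware(next) {
  if (this.method === 'GET' && this.path === '/login') {
    this.body = 'login page';
  } else if (this.method === 'GET' && this.path === '/signup') {
    this.body = 'signup page';
  } else {
    yield next; // no match: fall through downstream
  }
}

// Simulate a request with a fake context (no server needed):
var ctx = { method: 'GET', path: '/login' };
var gen = routerMiddleware.call(ctx, (function* () {})());
gen.next(); // run the middleware body; ctx.body is now set
```

A router module does exactly this kind of dispatch for us, with pattern matching and per-method registration on top.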

To add routing capability to our application, we will use a community module named koa-router.

      To use koa-router, we will modify the existing code to the code shown below:

      var koa = require('koa');
      var app = koa();
      var router = require('koa-router');
      var mount = require('koa-mount');
      
      var handler = function *(next){
          this.type = 'json';
          this.status = 200;
          this.body = {'Welcome': 'This is a level 2 Hello World Application!!'};
      };
      
      var APIv1 = new router();
      APIv1.get('/all', handler);
      
      app.use(mount('/v1', APIv1.middleware()));
      if (!module.parent) app.listen(3000);
      console.log('Hello World is Running on http://localhost:3000/');
      
      

      Here we have imported two modules, where router stores koa-router and mount stores the koa-mount module, allowing us to use the router in our Koa application.

On line 6, we have defined our handler function, which is the same function as before, but here we have given it a name. On line 12, we save an instance of the router in APIv1, and on line 13 we register our handler for all the GET requests on the route /all.

So all requests other than a GET request to localhost:3000/v1/all will return "Not Found". Finally, on line 15, we use the mount middleware, which gives a usable generator function that can be fed to app.use().

      To directly reach this step or compare your application, execute the following command in the cloned repo:

      $ git checkout 8f0d4e8

Before we run our application, we now need to install koa-router and koa-mount using npm. Notice that as the complexity of our application increases, the number of modules/dependencies also increases. 

To keep track of all the information regarding the project and make that data available to npm, we store all the information, including all the dependencies, in package.json. You can create package.json manually or by using the interactive command-line interface opened with the $ npm init command.

{
    "name": "koa-api-dictionary",
    "version": "0.0.1",
    "description": "koa-api-dictionary application",
    "main": "index",
    "author": {
        "name": "Bhanu Pratap Chaudhary",
        "email": "bhanu423@gmail.com"
    },
    "repository": {
        "type": "git",
        "url": "https://github.com/bhanuc/dictapi.git"
    },
    "license": "MIT",
    "engines": {
        "node": ">= 0.11.13"
    }
}
      
      

      A very minimal package.json file looks like the one above. 

      Once package.json is present, you can save the dependency using the following command:

      $ npm install <package-name> --save

      For example: In this case, we will install the modules using the following command to save the dependencies in package.json.

      $ npm install koa-router koa-mount --save

      Now you can run the application using $ node --harmony index.js

      You can read more about package.json here.

      Adding Routes for the Dictionary API

      We will start by creating two routes for the API, one for getting a single result in a faster query, and a second to get all the matching words (which is slower for the first time). 

      To keep things manageable, we will keep all the API functions in a separate folder called api and a file called api.js, and import it later in our main index.js file.

      var monk = require('monk');
      var wrap = require('co-monk');
      var db = monk('localhost/mydb');
      var words = wrap(db.get('words'));
      /**
      * GET all the results.
      */
      exports.all = function *(){
          if(this.request.query.word){
              var res = yield words.find({ word : this.request.query.word });
              this.body = res;
          } else {
              this.response.status = 404;
              }
          };
      /**
      * GET a single result.
      */
      exports.single = function *(){
          if(this.request.query.word){
              var res = yield words.findOne({ word : this.request.query.word });
              this.body = res;
          } else {
              this.response.status = 404;
          }
      };

Here we are using co-monk, which acts as a wrapper around monk, making it very easy for us to query MongoDB using generators in Koa. We import monk and co-monk, and connect to the MongoDB instance on line 3. We call wrap() on collections to make them generator-friendly. 

      Then we add two generator methods named all and single as a property of the exports variable so that they can be imported in other files. In each of the functions, first we check for the query parameter 'word.' If present, we query for the result or else we reply with a 404 error. 

We use the yield keyword to wait for the results, as discussed in the first article; it pauses execution until the result is received. In all, we use the find method, which returns all the matching words; the result is stored in res and sent back. In single, we use the findOne method available on the collection, which returns the first matching result. 
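To see why yield can "wait" for a query, here is a stripped-down sketch of a co-style runner that resumes the generator when a fake "query" calls back. Everything here is made up for illustration (the fake find completes synchronously; a real one is asynchronous, and the real runner, co, also handles promises and errors):

```javascript
// Minimal co-style runner: resumes the generator when the yielded
// thunk (a function taking a Node-style callback) completes.
function run(genFn, done) {
  var gen = genFn();
  function step(value) {
    var res = gen.next(value);          // resume with the query result
    if (res.done) return done(res.value);
    res.value(function (err, data) { step(data); }); // invoke the thunk
  }
  step();
}

// Fake "find": calls back immediately with a canned result
function fakeFind(word) {
  return function (cb) {
    cb(null, [{ word: word, meaning: 'a made-up definition' }]);
  };
}

var output;
run(function* () {
  var res = yield fakeFind('new'); // execution pauses here until the callback fires
  return res[0].word;
}, function (result) { output = result; });
```

The generator body reads top-to-bottom like synchronous code, while the runner shuttles control between the yield and the callback.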

      Assigning These Handlers to Routes

      var koa = require('koa');
      var app = koa();
      var router = require('koa-router');
      var mount = require('koa-mount');
      var api = require('./api/api.js');
      
      var APIv1 = new router();
      APIv1.get('/all', api.all);
      APIv1.get('/single', api.single);
      
      
      app.use(mount('/v1', APIv1.middleware()));
      if (!module.parent) app.listen(3000);
      console.log('Dictapi is Running on http://localhost:3000/');

Here we import the exported methods from api.js and assign handlers to the GET routes /all and /single, and we have a fully functional API and application ready.

To run the application, you just need to install the monk and co-monk modules using the command below. Also, ensure you have a running instance of MongoDB into which you have imported the collection present in the git repository, using the instructions mentioned in import.txt.

      $ npm install monk co-monk --save

      Now you can run the application using the following command:

      $ node --harmony index.js

      You can open the browser and open the following URLs to check the functioning of the application. Just replace 'new' with the word you want to query.

      • http://localhost:3000/v1/all?word=new
      • http://localhost:3000/v1/single?word=new

      To directly reach this step or compare your application, execute the following command in the cloned repo:

      $ git checkout f1076eb  

      Error Handling in Koa

By using cascading middleware, we can catch errors using the try/catch mechanism, since each middleware gets a chance to act both while yielding downstream and while control flows back upstream. So, if we add a try/catch middleware at the beginning of the application, it will catch any error thrown by the rest of the middleware during the request, as it is the last middleware the error reaches on the way back upstream. Adding the following code on line 10 or earlier in index.js should work.

app.use(function *(next){
    try {
        yield next; // pass on execution to downstream middleware
    } catch (err) { // executed only when an error occurs and no downstream middleware handled the request
        this.type = 'json'; // optional here
        this.status = err.status || 500;
        this.body = { 'error' : 'The application just went bonkers, hopefully NSA has all the logs ;)' };
        // delegate the error back to the application
        this.app.emit('error', err, this);
    }
});

      Adding Logging and Rate-Limiting to the Application

Storing logs is an essential part of a modern-day application, as logs are very helpful in debugging and finding issues. They also record all activity, and thus can be used to discover user activity patterns and other interesting patterns. 

      Rate-limiting has also become an essential part of modern-day applications, where it is important to stop spammers and bots from wasting your precious server resources and to stop them from scraping your API.

It is fairly easy to add logging and rate-limiting to our Koa application. We will use two community modules: koa-logger and koa-better-ratelimit. We need to add the following code to our application:

      var logger = require('koa-logger');
      var limit = require('koa-better-ratelimit');
      //Add the lines below just under error middleware.
      app.use(limit({ duration: 1000*60*3 , // 3 min
                      max: 10, blacklist: []}));
      app.use(logger());

Here we have imported the two modules and added them as middleware. The logger logs each request and prints to the stdout of the process, which can easily be saved to a file. The limit middleware caps the number of requests a given user can make in a given time frame (here, a maximum of ten requests in three minutes). You can also supply an array of IP addresses to blacklist, whose requests will not be processed.
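To make the window-and-counter idea concrete, here is a toy sketch of the bookkeeping a rate limiter performs. This is not koa-better-ratelimit's actual implementation, just the core idea of counting hits per IP inside a time window:

```javascript
// Toy rate-limiter bookkeeping: allow at most `max` hits per IP per window.
function makeLimiter(max, durationMs) {
  var hits = {}; // ip -> { count, windowStart }
  return function allow(ip, now) {
    var entry = hits[ip];
    if (!entry || now - entry.windowStart >= durationMs) {
      hits[ip] = { count: 1, windowStart: now }; // start a fresh window
      return true;
    }
    entry.count += 1;
    return entry.count <= max;
  };
}

var allow = makeLimiter(10, 3 * 60 * 1000); // 10 requests per 3 minutes
var results = [];
for (var i = 0; i < 11; i++) results.push(allow('1.2.3.4', 0));
```

With these settings, the first ten calls return true and the eleventh returns false; once the window elapses, the counter resets and requests are allowed again.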

Do remember to install the modules before using the code: 

      $ npm install koa-logger koa-better-ratelimit --save

      Compressing the Traffic

      One of the ways to ensure faster delivery is to gzip your response, which is fairly simple in Koa. To compress your traffic in Koa, you can use the koa-compress module. 

      Here, options can be an empty object or can be configured as per the requirement.

var compress = require('koa-compress');
var opts = {
    filter: function (content_type) { return /text/i.test(content_type); }, // filter requests to be compressed using a regex
    threshold: 2048, // minimum size to compress
    flush: require('zlib').Z_SYNC_FLUSH
};
// use the code below to add the middleware to the application
app.use(compress(opts));
      

You can even turn off compression for a request by adding the following code to a middleware:

this.compress = false;

Don't forget to install koa-compress using npm:

$ npm install koa-compress --save 

      To directly reach this step or compare your application, execute the following command in the cloned repo:

$ git checkout 8f5b5a6 

      Writing Tests

Tests should be an essential part of all code, and one should aim for maximum test coverage. In this article, we will be writing tests for the routes that are accessible from our application. We will be using supertest and Mocha to create our tests. 

We will store our tests in test.js in the api folder. In both tests, we first describe our test, giving it a more human-readable name. After that, we pass an anonymous function that describes the correct behavior of the test and takes a callback containing the actual test. In each test, we import our application, start the server, describe the request type, URL and query, and set the encoding to gzip. Finally, we check that the response is correct.

      var request = require('supertest');
      var api = require('../index.js');
      
      describe('GET all', function(){
        it('should respond with all the words', function(done){
          var app = api;
          request(app.listen())
          .get('/v1/all')
          .query({ word: 'new' })
          .set('Accept-Encoding', 'gzip')    
          .expect('Content-Type', /json/)
          .expect(200)
          .end(done);
        })
      })
      
      describe('GET /v1/single', function(){
        it('should respond with a single result', function(done){
          var app = api;
      
          request(app.listen())
          .get('/v1/single')
          .query({ word: 'new' })
          .set('Accept-Encoding', 'gzip')
          .expect(200)
          .expect('Content-Type', /json/)
          .end(function(err, res){
          if (err) throw err;
          else {
    if (!('_id' in res.body)) throw new Error("missing id");
              if (!('word' in res.body)) throw new Error("missing word");
              done();
          }
        });
        })
      })

      To run our test, we will make a Makefile:

      test:
          @NODE_ENV=test ./node_modules/.bin/mocha \
      		--require should \
      		--reporter nyan \
      		--harmony \
      		--bail \
      		api/test.js
      
      .PHONY: test

Here, we've configured the reporter (nyan cat) and the testing framework (Mocha). Note that we require the should assertion library and add --harmony to enable ES6 support. Finally, we specify the location of the tests. A Makefile like this can be extended to automate further testing of your application.

      Now to test your app, just use the following command in the main directory of the application. 

      $ make test

      Just remember to install testing modules (mocha, should, supertest) before testing, using the command below: 

$ npm install mocha should supertest --save-dev 

      Running in Production

To run our applications in production, we will use PM2, a useful Node process monitor. We should disable the logger middleware in production; this can be automated using environment variables.

To install PM2, enter the following command in the terminal:

      $ npm install pm2 -g 

      And our app can be launched using the following command:

      $ pm2 start index.js --node-args="--harmony" 

      Now, even if our application crashes, it will restart automatically and you can sleep soundly. 

      Conclusion

      Koa is a light and expressive middleware for Node.js that makes the process of writing web applications and APIs more enjoyable. 

      It allows you to leverage a multitude of community modules to extend the functionality of your application and simplify all the mundane tasks, making web development a fun activity. 

      Please don't hesitate to leave any comments, questions, or other information in the field below.








       


    1. Introduction to Tablesorter

      Final product image
      What You'll Be Creating

      Tablesorter is a straightforward jQuery plugin that provides dynamic column sorting and pagination in your HTML tables. It's a nice way to provide sortable, scripted tables that don't require the user to refresh the page. You can also use it when you're using Ajax in your application.

      This tutorial will showcase actual code and three examples of using Tablesorter. You can download the code at GitHub. Note that the Tablesorter download is actually missing a few graphic images for its pagers, so you may want to use my GitHub files.

      Download the code for this Tablesorter demo at Github

      Example 1: Basic Tablesorter

      My first example shows you how to use Tablesorter to provide a sortable list of Internet domains for sale. You can see the demo here and the code here.

      There are a few components that we need to set up for Tablesorter. First, we have to load jQuery and the tablesorter plugin. I'll also load its blue CSS theme:

<script type="text/javascript" src="./js/jquery-latest.js"></script>
<script type="text/javascript" src="./js/jquery.tablesorter.min.js"></script>
<link rel="stylesheet" href="./themes/blue/style.css" type="text/css" media="print, projection, screen" />

      Then, we'll build the table HTML:

      <table id="domainsTable" class="tablesorter"> 
          <thead> 
          <tr> 
              <th>Domain name</th> 
              <th>gTld</th> 
              <th>Category</th> 
              <th>Price</th> 
              <th>Contact</th> 
          </tr> 
          </thead> 
          <tbody> 
            <tr><td><a href="http://geogram.co">geogram.co</a></td><td>co</td><td>Internet</td><td>$49</td><td><a href="mailto:jeff@lookahead.io?subject=Offer for domain name: geogram.co">Purchase</a></td></tr>
            <tr><td><a href="http://newscloud.com">newscloud.com</a></td><td>com</td><td>News</td><td>$19999</td><td><a href="mailto:jeff@lookahead.io?subject=Offer for domain name: newscloud.com">Purchase</a></td></tr>
            <tr><td><a href="http://popcloud.com">popcloud.com</a></td><td>com</td><td>Music</td><td>$14999</td><td><a href="mailto:jeff@lookahead.io?subject=Offer for domain name: popcloud.com">Purchase</a></td></tr>
      <!-- ... -->
      </tbody> 
          </table> 
      

      After that, we need to initialize Tablesorter when the page loads:

      <script>
          $(document).ready(function() 
              { 
                  $("#domainsTable").tablesorter({sortList: [[3,1],[2,0]]}); 
              } 
          );
          </script>
          </body>

      In the example above, I'm setting the fourth column, which is the price column, to sort in descending order, and I'm setting the third column, which is the category column, to sort in ascending order. 

      Once done, you should see something like this:

      Basic tablesorter

      If you're not loading your tables dynamically from a database, you might be wondering if there's an easier way to generate HTML table code from long lists. There is! I describe it in How to Park, List and Sell Your Domains.

      Basically, I'm using a Google Drive spreadsheet which lists my domains, categories, and prices, and I use concatenate functions to generate Apache server aliases, JavaScript pricing code, and the Tablesorter table row HTML:

      My Google Drive HTML Generating Spreadsheet

      Here's what a concatenate function looks like in Google Drive:

      =CONCATENATE("<tr>","<td>",F2,"</td>","<td>",B2,"</td>","<td>",D2,"</td>","<td>$", TO_DOLLARS(E2),"</td>","<td>",G2,"</td>","</tr>")

      I use the Domena theme available at Envato Market as landing pages for each domain:

      My Domains for Sale page powered by Domena Theme

      I've customized JavaScript in the theme to change the price based on the domain that's loaded. I think the newer versions of Domena handle multiple domains more elegantly.

      Example 2: Paging With Tablesorter

      Now, we'll show you how to implement paging with Tablesorter. You can see the demo here and get the code here. It should look something like this:

      Paging with Tablesorter

      This time, we'll initialize Tablesorter in the <head> tag. We'll also add the Tablesorter plugin script:

       <script type="text/javascript" src="./js/jquery.tablesorter.pager.js"></script> 
       <script type="text/javascript">
      $(function() {
          	$("table")
      			.tablesorter({widthFixed: true, widgets: ['zebra']})
      			.tablesorterPager({container: $("#pager")});
      	});
      	</script>

      We'll place the HTML div for the pager below the table:

      </table> 
          <div id="pager" class="pager">
              <form>
          		<img src="./addons/pager/icons/first.png" class="first"/>
          		<img src="./addons/pager/icons/prev.png" class="prev"/>
          		<input type="text" class="pagedisplay"/>
          		<img src="./addons/pager/icons/next.png" class="next"/>
          		<img src="./addons/pager/icons/last.png" class="last"/>
          		<select class="pagesize">
          			<option selected="selected"  value="10">10</option>
          			<option value="20">20</option>
          			<option value="30">30</option>
          			<option  value="40">40</option>
          		</select>
          	</form>
          </div>
          </body>

      That's it.

      Note that I found the pager icons had been deleted from the Tablesorter GitHub site, so I downloaded them manually from the demo. It may be easiest for you to get them from the forked version of the Tuts+ repository.

      Example 3: Ajax Loading

      Now we'll look at how to use jQuery to populate a Tablesorter table dynamically. To start with, we'll initialize a Tablesorter table with just .IO domains. It'll look something like this:

      The AJAX Tablesorter Demo

      When you click the Add .COM Domains link, you'll see the table expand with .COM domains.

      You can see the demo here and the code here. The HTML for the Ajax request with the .COM domains is here.

      Here's the code that responds to the click event, loading additional rows via Ajax:

<p><a id="add-com-domains" href="#">Add .COM Domains via AJAX</a></p>
          <script>
          $(document).ready(function() 
              { 
                  $("#domainsTable").tablesorter({sortList: [[3,1],[2,0]]}); 
                  $("#add-com-domains").click(function() { 
                           $.get("./com-domains.html", function(html) { 
                               // append the "ajax'd" data to the table body 
                               $("table tbody").append(html); 
                              // let the plugin know that we made a update 
                              $("table").trigger("update"); 
                              // set sorting column and direction, this will sort on the third and second column 
                              var sorting = [[3,1],[2,0]]; 
                              $("table").trigger("sorton",[sorting]); 
                          }); 
                          $(this).hide();
                          return false; 
                      });            
              } 
          );
          </script>

      Tablesorter can definitely improve the user experience if used well.

      I hope you've found this tutorial useful. Please feel free to post corrections, questions or comments below. You can also reach me on Twitter @reifman or email me directly.


    1. Multi-Instance Node.js App in PaaS Using Redis Pub/Sub

      If you chose PaaS as hosting for your application, you probably had or will have this problem: Your app is deployed to small "containers" (known as dynos in Heroku, or gears in OpenShift) and you want to scale it. 

      In order to do so, you increase the number of containers—and every instance of your app is pretty much running in another virtual machine. This is good for a number of reasons, but it also means that the instances don't share memory. 

      In this tutorial I will show you how to overcome this little inconvenience.

      When you chose PaaS hosting, I assume that you had scaling in mind. Maybe your site already witnessed the Slashdot effect or you want to prepare yourself for it. Either way, making the instances communicate with each other is pretty simple.

      Keep in mind that in the article I will assume that you already have a Node.js app written and running.


      Step 1: Redis Setup

      First, you have to prepare your Redis database. I like to use Redis To Go, because the setup is really quick, and if you are using Heroku there is an add-on (although your account must have a credit card assigned to it). There is also Redis Cloud, which includes more storage and backups.

      From there, the Heroku setup is pretty easy: select Redis Cloud or Redis To Go on the Heroku Add-ons page, or use one of the following commands (the first is for Redis To Go, the second for Redis Cloud):

      $ heroku addons:add redistogo 
      $ heroku addons:add rediscloud

      Step 2: Setting Up node_redis

      At this point, we have to add the required Node module to the package.json file. We will use the recommended node_redis module, which is published on npm under the name redis. Add this line to your package.json file, in the dependencies section:

      "redis": "0.11.x"

      If you want, you can also include hiredis, a high-performance library written in C, which node_redis will use if it's available:

      "hiredis": "0.1.x"

      Depending on how you created your Redis database and which PaaS provider you use, the connection setup will look a bit different. You need host, port, username, and password for your connection.

      Heroku

      Heroku stores everything in the config variables as URLs. You have to extract the information you need from them using Node's url module (config var for Redis To Go is process.env.REDISTOGO_URL and for Redis Cloud process.env.REDISCLOUD_URL). This code goes on the top of your main application file:

      var redis = require('redis'); 
      var url = require('url'); 
      
      // extract host, port and password from the URL stored in the config variable
      var redisURL = url.parse(YOUR_CONFIG_VAR_HERE); 
      // note: node_redis takes the port first; use hostname (host would include the port)
      var client = redis.createClient(redisURL.port, redisURL.hostname); 
      
      // the password is the part of the auth string after the colon
      client.auth(redisURL.auth.split(':')[1]); 
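      To see exactly what url.parse() extracts, here is a quick standalone example. The URL below is made up in the Redis To Go style purely for illustration; it is not a real host or real credentials:

      ```javascript
      var url = require('url');

      // a made-up connection URL in the style Heroku stores in REDISTOGO_URL
      var exampleVar = 'redis://rediscloud:s3cret@pub-redis-12345.example.com:6379';
      var redisURL = url.parse(exampleVar);

      console.log(redisURL.hostname); // "pub-redis-12345.example.com"
      console.log(redisURL.port);     // "6379"
      // the password is everything after the colon in the auth part
      console.log(redisURL.auth.split(':')[1]); // "s3cret"
      ```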

      Others

      If you created the database by hand, or use a provider other than Heroku, you should have the connection options and credentials already, so just use them:

      var redis = require('redis'); 
      // node_redis expects the port first, then the host
      var client = redis.createClient(YOUR_PORT, YOUR_HOST); 
      client.auth(YOUR_PASSWORD);

      After that we can start working on communication between instances.


      Step 3: Sending and Receiving Data

      The simplest example will just notify the other instances that a new instance has started. For example, you could display this information in the admin panel.

      Before we do anything, create another connection named client2. I will explain why we need it later.

      Let's start by sending a message that announces the start. This is done using the publish() method of the client, which takes two arguments: the channel we want to send the message to, and the message's text:

      client.publish('instances', 'start'); 

      That's all you need to send the message. We can listen for messages in the message event handler (notice that we call this on our second client):

      client2.on('message', function (channel, message) {

      The callback is passed the same arguments that we pass to the publish() method. Now let's display this information in the console:

      if ((channel == 'instances') && (message == 'start')) 
          console.log('New instance started!'); 
      });

      The last thing to do is to actually subscribe to the channel we will use:

      client2.subscribe('instances');

      We used two clients for this because when you call subscribe() on a client, its connection is switched into subscriber mode. From that point on, the only commands you can send to the Redis server are SUBSCRIBE and UNSUBSCRIBE, so a connection in subscriber mode can't publish() messages. That's why the publishing happens on the first client.

      If you want, you can also send a message when the instance is shutting down: listen for the SIGTERM signal and publish to the same channel:

      process.on('SIGTERM', function () { 
          client.publish('instances', 'stop'); 
          process.exit(); 
      }); 

      To handle that case, add this else if branch to the message handler:

      else if ((channel == 'instances') && (message == 'stop')) 
          console.log('Instance stopped!');

      So it looks like this afterwards:

      client2.on('message', function (channel, message) { 
      
          if ((channel == 'instances') && (message == 'start')) 
              console.log('New instance started!'); 
          else if ((channel == 'instances') && (message == 'stop')) 
              console.log('Instance stopped!'); 
      
      });

      To test it locally, start your app a few times and watch what happens in the console. If you want to test the termination message, don't press Ctrl+C in the terminal; use the kill command instead. Note that Windows does not support the SIGTERM signal, so you can't test this part there.

      First, use the ps command to check what ID your process has, piping the output to grep to make it easier to find:

      $ ps -aux | grep your_apps_name 

      The second column of the output is the ID you are looking for. Keep in mind that there will also be a line for the grep command you just ran. Now execute the kill command, using 15 as the signal number (that's SIGTERM):

      $ kill -15 PID

      PID is your process ID.


      Real-World Examples

      Now that you know how to use the Redis Pub/Sub protocol, you can go beyond the simple example presented earlier. Here are a few use-cases that may be helpful.

      Express Sessions

      This one is extremely helpful if you are using Express.js as your framework. If your application supports user logins, or anything else that uses sessions, you will want user sessions preserved no matter whether an instance restarts, the user is routed to another instance, or the original instance goes down.

      A few things to remember:

      • The free Redis instances will not suffice: you need more memory than the 5MB/25MB they provide.
      • You will need another connection for this.

      We will need the connect-redis module. The version depends on the version of Express you are using. This one is for Express 3.x:

      "connect-redis": "1.4.7"

      And this for Express 4.x:

      "connect-redis": "2.x"

      Now create another Redis connection named client_sessions. The usage of the module again depends on the Express version. For 3.x you create the RedisStore like this:

      var RedisStore = require('connect-redis')(express);

      And in 4.x you have to pass the express-session as the parameter:

      var session = require('express-session'); 
      var RedisStore = require('connect-redis')(session);

      After that the setup is the same in both versions:

      app.use(session({ store: new RedisStore({ client: client_sessions }), secret: 'your secret string' }));

      As you can see, we are passing our Redis client as the client property of the object passed to RedisStore's constructor, and then we pass the store to the session constructor.

      Now if you start your app, log in, or initiate a session and restart the instance, your session will be preserved. The same happens when the instance is switched for the user.

      Exchanging Data With WebSockets

      Let's say you have a completely separate instance (a worker dyno on Heroku) for resource-intensive work like complicated calculations, processing data in the database, or exchanging a lot of data with an external service. You will want the "normal" web instances (and therefore the users) to know the result of this work when it's done.

      Depending on whether you want the web instances to send any data to the worker, you will need one or two connections (let's name them client_sub and client_pub on the worker too). You can also reuse any connection that is not subscribing to anything (like the one you use for Express sessions) instead of the client_pub.

      Now when the user wants to perform the action, you publish the message on the channel that is reserved just for this user and for this specific job:

      // this goes into your request handler 
      client_pub.publish('JOB:USERID:JOBNAME:START', JSON.stringify(THEDATAYOUWANTTOSEND));
      client_sub.subscribe('JOB:USERID:JOBNAME:PROGRESS');

      Of course you'll have to replace USERID and JOBNAME with appropriate values. You should also have the message handler prepared for the client_sub connection:

      client_sub.on('message', function (channel, message) { 
      
          var USERID = channel.split(':')[1]; 
          
          if (message == 'DONE') 
              client_sub.unsubscribe(channel); 
          
          sockets[USERID].emit(channel, message); 
      
      });

      This extracts the USERID from the channel name (so make sure you don't subscribe to channels not related to user jobs on this connection), and sends the message to the appropriate client. Depending on which WebSocket library you use, there will be some way to access a socket by its ID.
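      Since these channel names carry structured data, it can help to centralize building and parsing them in one place. The helpers below are hypothetical (jobChannel() and parseJobChannel() are my own names, not part of the original app), but they make the USERID extraction above less error-prone:

      ```javascript
      // Hypothetical helpers for the JOB:USERID:JOBNAME:STAGE naming scheme used above.
      function jobChannel(userId, jobName, stage) {
          return ['JOB', userId, jobName, stage].join(':');
      }

      function parseJobChannel(channel) {
          var parts = channel.split(':');
          return { userId: parts[1], jobName: parts[2], stage: parts[3] };
      }

      console.log(jobChannel('42', 'resize', 'PROGRESS')); // "JOB:42:resize:PROGRESS"

      var parsed = parseJobChannel('JOB:42:resize:PROGRESS');
      console.log(parsed.userId); // "42"
      console.log(parsed.stage);  // "PROGRESS"
      ```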

      You may wonder how the worker instance can subscribe to all of those channels. Obviously, you can't loop over every possible USERID and JOBNAME. Instead, the psubscribe() method accepts a pattern as its argument, so the worker can subscribe to all JOB:* channels at once:

      // this code goes to the worker instance 
      // and you call it ONCE 
      client_sub.psubscribe('JOB:*');
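      One detail to watch: with psubscribe(), node_redis emits pmessage events (which get the matched pattern as an extra first argument) rather than message events. A minimal sketch of the worker side follows, with the job logic pulled out into a plain function so it can be reasoned about without a live Redis server; handleJob() and its return shape are my own assumptions, not code from the original app:

      ```javascript
      // Pure function: turns an incoming job channel + payload into a reply.
      // The channel format is JOB:USERID:JOBNAME:START, as described above.
      function handleJob(channel, message) {
          var parts = channel.split(':'); // ["JOB", USERID, JOBNAME, "START"]
          var data = JSON.parse(message);
          // ...the heavy work would happen here, using `data`...
          return {
              replyChannel: 'JOB:' + parts[1] + ':' + parts[2] + ':PROGRESS',
              result: 'DONE'
          };
      }

      // Wiring it up (requires live Redis connections, shown for context only):
      // client_sub.on('pmessage', function (pattern, channel, message) {
      //     var reply = handleJob(channel, message);
      //     client_pub.publish(reply.replyChannel, reply.result);
      // });
      // client_sub.psubscribe('JOB:*');

      console.log(handleJob('JOB:42:resize:START', '{"width":100}').replyChannel);
      // "JOB:42:resize:PROGRESS"
      ```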

      Common Problems

      There are a few problems you may encounter when using Pub/Sub:

      • Your connection to the Redis server is refused. If this happens, make sure you provide proper connection options and credentials, and that the maximum number of connections has not been reached.
      • Your messages are not delivered. If this happens, check that you subscribed to the same channel you are sending messages on (seems silly, but sometimes happens). Also make sure that you attach the message handler before calling subscribe(), and that you call subscribe() on one instance before you call publish() on the other.

       

