Sunday, 20 December 2015

Live Reload Considered Harmful

I've seen it go by many names; live reload, hot reload, browser sync... the list goes on. It's been the subject of a million demos. It's the focus of a thousand npm packages. Someone tweaks a file and... wait for it... doesn't have to refresh their browser to see the changes... The future is now!

Forgive me the sarcasm, but I have come to the conclusion that whilst live reload is impressive... for my own purposes, it is not actually that useful. It certainly shouldn't be the default go-to that it seems to have become.

Hear me out people, I may be the voice crying out in the wilderness but I'm right dammit.

Search your feelings, you know it to be true.

Why is Live Reload a Thing?

What is live reload? Well, having to hit F5 after you've made a change... that seems like such hard work, right? To quote Phil Haack:

... we’re software developers.... It’s time to AWW TOE MATE!

Yup, automation. Anything that a developer can theoretically automate.... will be automated. Usually this is a good thing but automation can be addictive. And on this occasion it's time for an intervention.

What else could be the attraction? Well, this is speculation but I would say that the implementation actually has something to do with it. Live reload is almost invariably powered by WebSockets and they are certainly cool. Developers, I know what you are like. You're attracted by the new shiny thing. You can't resist the allure of WS. And with live reload idling away in the background, you're all bleeding edge. I can say all this because this is exactly what I am like.

Why is Live Reload a BAD Thing?

Well the OCD part of me is instinctively repelled by the extra script tag of alien code that live reload foists upon your app. How very dare that <script src="http://localhost:35729/livereload.js?snipver=1"></script> push its way into my pristine DOM. It's an outrage.

Perhaps a more convincing rationale is how useful it is to have 2 different versions of your app up on screen at the same time. I like to try things out when I'm working. I get a screen working one way and then I tweak and play with my implementation. I have the app of 10 minutes ago sat side by side with the newly adjusted one. Assess, compare and declare a winner. That's so useful and live reload does away with it. That's a problem.

Finally, I'm an obsessive 'Ctrl-S'-er. I've been burned by unsaved changes too many times. I'm saving every couple of keypresses. With live reload that usually means I have the noise of a dead application in the corner of my eye as LR obsessively forces the latest brokenness upon me. That sucks.

I've no doubt there are situations where live reload is useful. But for my money that's the exception rather than the rule. Let the madness end now. Just say "no", kids.

Wednesday, 16 December 2015

ES6 + TypeScript + Babel + React + Flux + Karma: The Secret Recipe

I wrote a while ago about how I was using some different tools in a current project:

  • React with JSX
  • Flux
  • ES6 with Babel
  • Karma for unit testing

I have fully come to love and appreciate all of the above. I really like working with them. However. There was still an ache in my soul and a thorn in my side. Whilst I love the syntax of ES6 and even though I've come to appreciate the clarity of JSX, I have been missing something. Perhaps you can guess? It's static typing.

It's actually been really good to have the chance to work without it because it's made me realise what a productivity boost having static typing actually is. The number of time-burning silly mistakes that a compiler could have caught for me.... Sigh.
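To make that concrete, here's a contrived TypeScript sketch (the names are invented purely for illustration) of the sort of mistake a compiler catches at build time:

```typescript
// Contrived illustration; these names don't come from the actual project.
interface Person {
  name: string;
}

function greet(person: Person): string {
  return 'Hello ' + person.name;
}

// greet({ nmae: 'Ada' }); // typo: TypeScript flags this at compile time;
//                         // plain ES6 would happily log 'Hello undefined'
console.log(greet({ name: 'Ada' }));
```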

But the pain is over. The dark days are gone. It's possible to have strong typing, courtesy of TypeScript, plugged into this workflow. It's yours for the taking. Take it. Take it now!

What a Guy Wants

I decided a couple of months ago what I wanted to have in my setup:

  1. I want to be able to write React / JSX in TypeScript. Naturally I couldn't achieve that by myself but handily the TypeScript team decided to add support for JSX with TypeScript 1.6. Ooh yeah.
  2. I wanted to be able to write ES6. When I realised the approach for writing ES6 and having the transpilation handled by TypeScript wasn't clear I had another idea. I thought "what if I write ES6 and hand off the transpilation to Babel?" i.e. Use TypeScript for type checking, not for transpilation. I realised that James Brantly had my back here already. Enter Webpack and ts-loader.
  3. Debugging. Being able to debug my code is non-negotiable for me. If I can't debug it I'm less productive. (I'm also bitter and twisted inside.) I should say that I wanted to be able to debug my original source code. Thanks to the magic of sourcemaps, that mad thing is possible.
  4. Karma for unit testing. I've become accustomed to writing my tests in ES6 and running them on a continual basis with Karma. This allows for a rather good debugging story as well. I didn't want to lose this when I moved to TypeScript. I didn't.

So I've talked about what I want and I've alluded to some of the solutions that there are. The question now is how to bring them all together. This post is, for the most part, going to be about correctly orchestrating a number of gulp tasks to achieve the goals listed above. If you're after the Blue Peter "here's one I made earlier" moment then take a look at the es6-babel-react-flux-karma repo in the Microsoft/TypeScriptSamples repo on GitHub.

gulpfile.js

/* eslint-disable no-var, strict, prefer-arrow-callback */
'use strict';

var gulp = require('gulp');
var gutil = require('gulp-util');
var connect = require('gulp-connect');
var eslint = require('gulp-eslint');
var webpack = require('./gulp/webpack');
var staticFiles = require('./gulp/staticFiles');
var tests = require('./gulp/tests');
var clean = require('./gulp/clean');
var inject = require('./gulp/inject');

var lintSrcs = ['./gulp/**/*.js'];

gulp.task('delete-dist', function (done) {
  clean.run(done);
});

gulp.task('build-process.env.NODE_ENV', function () {
  process.env.NODE_ENV = 'production';
});

gulp.task('build-js', ['delete-dist', 'build-process.env.NODE_ENV'], function(done) {
  webpack.build().then(function() { done(); });
});

gulp.task('build-other', ['delete-dist', 'build-process.env.NODE_ENV'], function() {
  staticFiles.build();
});

gulp.task('build', ['build-js', 'build-other', 'lint'], function () {
  inject.build();
});

gulp.task('lint', function () {
  return gulp.src(lintSrcs)
    .pipe(eslint())
    .pipe(eslint.format());
});

gulp.task('watch', ['delete-dist'], function() {
  process.env.NODE_ENV = 'development';
  Promise.all([
    webpack.watch()//,
    //less.watch()
  ]).then(function() {
    gutil.log('Now that initial assets (js and css) are generated inject will start...');
    inject.watch(postInjectCb);
  }).catch(function(error) {
    gutil.log('Problem generating initial assets (js and css)', error);
  });

  gulp.watch(lintSrcs, ['lint']);
  staticFiles.watch();
  tests.watch();
});

gulp.task('watch-and-serve', ['watch'], function() {
  postInjectCb = stopAndStartServer;
});

var postInjectCb = null;
var serverStarted = false;
function stopAndStartServer() {
  if (serverStarted) {
    gutil.log('Stopping server');
    connect.serverClose();
    serverStarted = false;
  }
  startServer();
}

function startServer() {
  gutil.log('Starting server');
  connect.server({
    root: './dist',
    port: 8080
  });
  serverStarted = true;
}

Let's start picking this apart; what do we actually have here? Well, we have 2 gulp tasks that I want you to notice:

build

This is likely the task you would use when deploying. It takes all of your source code, builds it, provides cache-busting filenames (e.g. main.dd2fa20cd9eac9d1fb2f.js), injects your shell SPA page with references to the files and deploys everything to the ./dist/ directory. So that's TypeScript, static assets like images and CSS all made ready for Production.
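As an aside, the idea behind those cache-busting names can be sketched like this. It's a simplification for illustration only, not what webpack actually does internally (webpack's [hash] placeholder takes care of this for real): fingerprint the content so that the filename, and hence the browser's cache entry, only changes when the code does.

```typescript
// Illustrative sketch only; webpack's [hash] placeholder does the real work.
import { createHash } from 'crypto';

function cacheBustedName(name: string, contents: string): string {
  // Derive a short fingerprint from the file contents
  const hash = createHash('md5').update(contents).digest('hex').slice(0, 20);
  return name + '.' + hash + '.js';
}

console.log(cacheBustedName('main', 'console.log("hi");'));
// → main.<20-hex-char-fingerprint>.js
```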

The build task also implements this advice:

When deploying your app, set the NODE_ENV environment variable to production to use the production build of React which does not include the development warnings and runs significantly faster.

watch-and-serve

This task represents "development mode" or "debug mode". It's what you'll likely be running as you develop your app. It does the same as the build task but with some important distinctions.

  • As well as building your source it also runs your tests using Karma
  • This task is not triggered on a once-only basis, rather your files are watched and each tweak of a file will result in a new build and a fresh run of your tests. Nice eh?
  • It spins up a simple web server and serves up the contents of ./dist (i.e. your built code) in order that you can easily test out your app.
  • In addition, whilst it builds your source it does not minify your code and it emits sourcemaps. For why? For debugging! You can go to http://localhost:8080/ in your browser of choice, fire up the dev tools and you're off to the races; debugging like gangbusters. It also doesn't bother to provide cache-busting filenames as Chrome dev tools are smart enough to not cache localhost.
  • Oh and Karma.... If you've got problems with a failing test then head to http://localhost:9876/ and you can debug the tests in your dev tools.
  • Finally, it runs ESLint in the console. Not all of my files are TypeScript; essentially the build process (aka "gulp-y") files are all vanilla JS. So they're easily breakable. ESLint is there to provide a little reassurance on that front.

Now let's dig into each of these in a little more detail.

WebPack

Let's take a look at what's happening under the covers of webpack.build() and webpack.watch().

WebPack with ts-loader and babel-loader is what we're using to compile our ES6-flavoured TypeScript. ts-loader uses the TypeScript compiler to, um, compile TypeScript and emit ES6 code. This is then passed on to the babel-loader which transpiles it from ES6 down to ES-old-school. It all gets brought together in 2 files; main.js which contains the compiled result of the code written by us and vendor.js which contains the compiled result of 3rd party / vendor files. The reason for this separation is that vendor files are likely to change fairly rarely whilst our own code will constantly be changing. This separation allows for quicker compile times upon file changes as, for the most part, the vendor files will not need to be included in this process.

Our gulpfile.js above uses the following task:

'use strict';

var gulp = require('gulp');
var gutil = require('gulp-util');
var webpack = require('webpack');
var WebpackNotifierPlugin = require('webpack-notifier');
var webpackConfig = require('../webpack.config.js');

function buildProduction(done) {
  // modify some webpack config options
  var myProdConfig = Object.create(webpackConfig);
  myProdConfig.output.filename = '[name].[hash].js';

  myProdConfig.plugins = myProdConfig.plugins.concat(
    // make the vendor.js file with cachebusting filename
    new webpack.optimize.CommonsChunkPlugin({ name: 'vendor', filename: 'vendor.[hash].js' }),
    new webpack.optimize.DedupePlugin(),
    new webpack.optimize.UglifyJsPlugin()
  );

  // run webpack
  webpack(myProdConfig, function(err, stats) {
    if(err) { throw new gutil.PluginError('webpack:build', err); }
    gutil.log('[webpack:build]', stats.toString({
      colors: true
    }));

    if (done) { done(); }
  });
}

function createDevCompiler() {
  // show me some sourcemap love people
  var myDevConfig = Object.create(webpackConfig);
  myDevConfig.devtool = 'inline-source-map';
  myDevConfig.debug = true;

  myDevConfig.plugins = myDevConfig.plugins.concat(
    // Make the vendor.js file
    new webpack.optimize.CommonsChunkPlugin({ name: 'vendor', filename: 'vendor.js' }), 
    new WebpackNotifierPlugin({ title: 'Webpack build', excludeWarnings: true })
  );

  // create a single instance of the compiler to allow caching
  return webpack(myDevConfig);
}

function buildDevelopment(done, devCompiler) {
  // run webpack
  devCompiler.run(function(err, stats) {
    if(err) { throw new gutil.PluginError('webpack:build-dev', err); }
    gutil.log('[webpack:build-dev]', stats.toString({
      chunks: false, // dial down the output from webpack (it can be noisy)
      colors: true
    }));

    if (done) { done(); }
  });
}


function bundle(options) {
  var devCompiler;

  function build(done) {
    if (options.shouldWatch) {
      buildDevelopment(done, devCompiler);
    } else {
      buildProduction(done);
    }
  }

  if (options.shouldWatch) {
    devCompiler = createDevCompiler();

    gulp.watch('src/**/*', function() { build(); });
  }

  return new Promise(function(resolve, reject) {
    build(function (err) {
      if (err) {
        reject(err);
      } else {
        resolve('webpack built');
      }
    });
  });
}

module.exports = {
  build: function() { return bundle({ shouldWatch: false }); },
  watch: function() { return bundle({ shouldWatch: true  }); }
};

Hopefully this is fairly self-explanatory; essentially buildDevelopment performs the development build (providing sourcemap support) and buildProduction builds for Production (providing minification support). Both are driven by this webpack.config.js:

/* eslint-disable no-var, strict, prefer-arrow-callback */
'use strict';

var path = require('path');

module.exports = {
  cache: true,
  entry: {
    // The entry point of our application; the script that imports all other scripts in our SPA
    main: './src/main.tsx', 

    // The packages that are to be included in vendor.js
    vendor: [
      'babel-polyfill',
      'events',
      'flux',
      'react'
    ]
  },

  // Where the output of our compilation ends up
  output: {
    path: path.resolve(__dirname, './dist/scripts'),
    filename: '[name].js',
    chunkFilename: '[chunkhash].js'
  },

  module: {
    loaders: [{
      // The loader that handles ts and tsx files.  These are compiled
      // with the ts-loader and the output is then passed through to the
      // babel-loader.  The babel-loader uses the es2015 and react presets
      // in order that jsx and es6 are processed.
      test: /\.ts(x?)$/,
      exclude: /node_modules/,
      loader: 'babel-loader?presets[]=es2015&presets[]=react!ts-loader'
    }, {
      // The loader that handles any js files presented alone.
      // It passes these to the babel-loader which (again) uses the es2015
      // and react presets.
      test: /\.js$/,
      exclude: /node_modules/,
      loader: 'babel',
      query: {
        presets: ['es2015', 'react']
      }
    }]
  },
  plugins: [
  ],
  resolve: {
    // Files with the following extensions are fair game for webpack to process
    extensions: ['', '.webpack.js', '.web.js', '.ts', '.tsx', '.js']
  },
};

Inject

Your compiled output needs to be referenced from some kind of HTML page. So we've got this:

<!doctype html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <meta http-equiv="X-UA-Compatible" content="IE=edge">
    <meta name="viewport" content="width=device-width, initial-scale=1">

    <title>ES6 + Babel + React + Flux + Karma: The Secret Recipe</title>

    <!-- inject:css -->
    <!-- endinject -->
    <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.5/css/bootstrap.min.css">
  </head>
  <body>
    <div id="content"></div>
    <!-- inject:js -->
    <!-- endinject -->
  </body>
</html>

Which is no more than a boilerplate HTML page with a couple of key features:

  • a single <div /> element in the <body /> which is where our React app is going to be rendered.
  • <!-- inject:css --> and <!-- inject:js --> placeholders where css and js is going to be injected by gulp-inject.
  • a single <link /> to the Bootstrap CDN. This sample app doesn't actually serve up any css generated as part of the project. It could but it doesn't. When it comes to injection time no css will actually be injected. This has been left in place as, more typically, a project would have some styling served up.

This is fed into our inject task in inject.build() and inject.watch(). They take css and javascript and, using our shell template, create a new page which has the css and javascript dropped into their respective placeholders:

'use strict';

var gulp = require('gulp');
var inject = require('gulp-inject');
var glob = require('glob');

function injectIndex(options) {
  var postInjectCb = options.postInjectCb;
  var postInjectCbTriggerId = null;
  function run() {
    var target = gulp.src('./src/index.html');
    var sources = gulp.src([
      //'./dist/styles/main*.css',
      './dist/scripts/vendor*.js',
      './dist/scripts/main*.js'
    ], { read: false });

    return target
      .on('end', function() { // invoke postInjectCb after 1s
        if (postInjectCbTriggerId || !postInjectCb) { return; }

        postInjectCbTriggerId = setTimeout(function() {
          postInjectCb();
          postInjectCbTriggerId = null;
        }, 1000);
      })
      .pipe(inject(sources, { ignorePath: '/dist/', addRootSlash: false, removeTags: true }))
      .pipe(gulp.dest('./dist'));
  }

  var jsCssGlob = 'dist/**/*.{js,css}';

  function checkForInitialFilesThenRun() {
    glob(jsCssGlob, function (er, files) {
      var filesWeNeed = ['dist/scripts/main', 'dist/scripts/vendor'/*, 'dist/styles/main'*/];

      function fileIsPresent(fileWeNeed) {
        return files.some(function(file) {
          return file.indexOf(fileWeNeed) !== -1;
        });
      }

      if (filesWeNeed.every(fileIsPresent)) {
        run('initial build');
      } else {
        // not everything is built yet; glob again shortly rather than
        // re-checking in a tight loop
        setTimeout(checkForInitialFilesThenRun, 100);
      }
    });
  }

  checkForInitialFilesThenRun();

  if (options.shouldWatch) {
    gulp.watch(jsCssGlob, function(evt) {
      if (evt.path && evt.type === 'changed') {
        run(evt.path);
      }
    });
  }
}

module.exports = {
  build: function() { return injectIndex({ shouldWatch: false }); },
  watch: function(postInjectCb) { return injectIndex({ shouldWatch: true, postInjectCb: postInjectCb }); }
};

This also triggers the server to serve up the new content.
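For illustration, after injection the relevant part of the built index.html ends up looking something like this (the hash is hypothetical, and removeTags: true means the placeholder comments themselves are stripped out):

```html
<body>
  <div id="content"></div>
  <script src="scripts/vendor.dd2fa20cd9eac9d1fb2f.js"></script>
  <script src="scripts/main.dd2fa20cd9eac9d1fb2f.js"></script>
</body>
```

Note that thanks to ignorePath: '/dist/' and addRootSlash: false the injected paths are relative to the ./dist root that the server serves up.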

Static Files

Your app will likely rely on a number of static assets; images, fonts and whatnot. This script picks up the static assets you've defined and places them in the dist folder ready for use:

'use strict';

var gulp = require('gulp');
var cache = require('gulp-cached');

var targets = [
  // In my own example I don't use any of the targets below, they
  // are included to give you more of a feel of how you might use this
  { description: 'FONTS',   src: './fonts/*',     dest: './dist/fonts' },  
  { description: 'STYLES',  src: './styles/*',    dest: './dist/styles' },
  { description: 'FAVICON', src: './favicon.ico', dest: './dist' },  
  { description: 'IMAGES',  src: './images/*',    dest: './dist/images' }
];

function copy(options) {
  // Copy files from their source to their destination
  function run(target) {
    gulp.src(target.src)
      .pipe(cache(target.description))
      .pipe(gulp.dest(target.dest));
  }

  function watch(target) {
    gulp.watch(target.src, function() { run(target); });
  }

  targets.forEach(run);

  if (options.shouldWatch) {
    targets.forEach(watch);
  }
}

module.exports = {
  build: function() { return copy({ shouldWatch: false }); },
  watch: function() { return copy({ shouldWatch: true }); }
};

Karma

Finally, we're ready to get our tests set up to run continually with Karma. tests.watch() triggers the following task:

'use strict';

var Server = require('karma').Server;
var path = require('path');
var gutil = require('gulp-util');

module.exports = {
  watch: function() {
    // Documentation: https://karma-runner.github.io/0.13/dev/public-api.html
    var karmaConfig = {
      configFile: path.join(__dirname, '../karma.conf.js'),
      singleRun: false,

      plugins: ['karma-webpack', 'karma-jasmine', 'karma-mocha-reporter', 'karma-sourcemap-loader', 'karma-phantomjs-launcher', 'karma-phantomjs-shim'], // karma-phantomjs-shim only in place until PhantomJS hits 2.0 and has function.bind
      reporters: ['mocha']
    };

    new Server(karmaConfig, karmaCompleted).start();

    function karmaCompleted(exitCode) {
      gutil.log('Karma has exited with:', exitCode);
      process.exit(exitCode);
    }
  }
};

When running in watch mode it's possible to debug the tests by going to: http://localhost:9876/. It's also possible to run the tests standalone with a simple npm run test. Running them like this also outputs the results to an XML file in JUnit format; this can be useful for integrating into CI solutions that don't natively pick up test results.
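For the curious, a standalone run amounts to something along these lines. This is a hypothetical sketch mirroring the watch task above, with singleRun: true and the junit reporter added; the actual wiring lives in the repo's package.json and karma.conf.js:

```javascript
// Hypothetical sketch of a single-run invocation with JUnit output
var Server = require('karma').Server;
var path = require('path');

new Server({
  configFile: path.join(__dirname, 'karma.conf.js'),
  singleRun: true,               // run once and exit (CI friendly)
  reporters: ['mocha', 'junit']  // junit writes XML for CI servers
}, function (exitCode) {
  process.exit(exitCode);
}).start();
```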

Whichever approach we use for running tests, we use the following karma.conf.js file to configure Karma:

/* eslint-disable no-var, strict */
'use strict';

var webpackConfig = require('./webpack.config.js');

module.exports = function(config) {
  // Documentation: https://karma-runner.github.io/0.13/config/configuration-file.html
  config.set({
    browsers: [ 'PhantomJS' ],

    files: [
      'test/import-babel-polyfill.js', // This ensures we have the es6 shims in place from babel
      'test/**/*.tests.ts',
      'test/**/*.tests.tsx'
    ],

    port: 9876,

    frameworks: [ 'jasmine', 'phantomjs-shim' ],

    logLevel: config.LOG_INFO, //config.LOG_DEBUG

    preprocessors: {
      'test/import-babel-polyfill.js': [ 'webpack', 'sourcemap' ],
      'src/**/*.{ts,tsx}': [ 'webpack', 'sourcemap' ],
      'test/**/*.tests.{ts,tsx}': [ 'webpack', 'sourcemap' ]
    },

    webpack: {
      devtool: 'eval-source-map', //'inline-source-map', - inline-source-map doesn't work at present
      debug: true,
      module: webpackConfig.module,
      resolve: webpackConfig.resolve
    },

    webpackMiddleware: {
      quiet: true,
      stats: {
        colors: true
      }
    },

    // reporter options
    mochaReporter: {
      colors: {
        success: 'bgGreen',
        info: 'cyan',
        warning: 'bgBlue',
        error: 'bgRed'
      }
    },

    junitReporter: {
      outputDir: 'test-results', // results will be saved as $outputDir/$browserName.xml
      outputFile: undefined, // if included, results will be saved as $outputDir/$browserName/$outputFile
      suite: ''
    }
  });
};

As you can see, we're still using our webpack configuration from earlier to configure much of how the transpilation takes place.

And that's it; we have a workflow for developing in TypeScript using React with tests running in an automated fashion. I appreciate this has been a rather long blog post but I hope I've clarified somewhat how this all plugs together and works. Do leave a comment if you think I've missed something.

Babel 5 -> Babel 6

This post has actually been sat waiting to be published for some time. I'd got this solution up and running with Babel 5. Then they shipped Babel 6 and (as is the way with "breaking changes") broke sourcemap support and thus torpedoed this workflow. Happily that's now been resolved. But if you should experience any wonkiness - it's worth checking that you're using the latest and greatest of Babel 6.

Monday, 30 November 2015

IQueryable... IEnumerable... Hmmm...

So there I was, tip-tapping away at my keyboard when I became aware of the slowly loudening noise of a debate. It wasn't about poverty, war, civil rights or anything like that. No; this was far more contentious. It was about the behaviour of IQueryable<T> when mixed with IEnumerable<T>. I know, right, how could I not get involved?

The code that was being debated was a database query that was being facilitated by Entity Framework. Now let me ask you a question: what is the problem with the methods below?

private IEnumerable<Sage> GetSagesWithSayings()
{
    IQueryable<Sage> sageWithSayings = 
        from s in DbContext.Sages.Include(x => x.Sayings)
        select s;

    return sageWithSayings;
}

public IEnumerable<Sage> GetSagesWithSayingsBornWithinTheLast100Years()
{
    var aHundredYearsAgo = DateTime.Now.AddYears(-100);
    var sageWithSayings = GetSagesWithSayings().Where(x => x.DateOfBirth > aHundredYearsAgo);

    return sageWithSayings;
}

I've rather emphasised the problem by expressly declaring types in the GetSagesWithSayings method. More typically the IQueryable<Sage> would be hiding itself beneath a var making the problem less obvious. But you get the point; it's something to do with an IQueryable<Sage> being passed back as an IEnumerable<Sage>.

The debate was raging around what this piece of code (or one much like it) actually did. One side positing "it'll get every record from the database and then throw away what it doesn't need in C#-land..." The opposing view being "are you sure about that? Doesn't it just get the records from the last hundred years from the database?"

So it comes down to the SQL that ends up being generated. Is it going to get everything from the Sages table...

select ... 
from Sages ...

Or does it include a filter clause as well?

select ... 
from Sages ...
where DateOfBirth > '1915-11-30'

You probably know the answer... It gets everything. Every record is brought back from the database and those that are older than 100 years are then casually thrown away. So kinda wasteful. That's the problem. But why? And what does that tell us?

LINQ to Objects vs LINQ to ... ?

The term "LINQ to Objects" refers to the use of LINQ queries with any IEnumerable or IEnumerable<T> collection directly, without the use of an intermediate LINQ provider or API such as LINQ to SQL or LINQ to XML.

The IQueryable<T> interface is intended for implementation by query providers.

This interface inherits the IEnumerable<T> interface so that if it represents a query, the results of that query can be enumerated. Enumeration forces the expression tree associated with an IQueryable<T> object to be executed. Queries that do not return enumerable results are executed when the Execute<TResult>(Expression) method is called.

The definition of "executing an expression tree" is specific to a query provider. For example, it may involve translating the expression tree to a query language appropriate for an underlying data source.

I know - check me out with my "quotes".

Now, IEnumerable and IQueryable are similar; for instance they are both considered "lazy" as they offer deferred execution. But there is an important difference between IEnumerable and IQueryable; namely that IQueryable hands off information about a query to another provider in order that they may decide how to do the necessary work. IEnumerable does not; its work is done in memory by operating on the data it has.

So let's apply this to our issue. We have an IQueryable<Sage> and we return it as an IEnumerable<Sage>. By doing this we haven't changed the underlying type; it's still an IQueryable<Sage>. But by upcasting to IEnumerable<Sage> we have told the compiler that we don't have an IQueryable<Sage>. We've lied. I trust you're feeling guilty.

No doubt whoever raised you told you not to tell lies. This was probably the very situation they had in mind. The implications of our dirty little fib come back to haunt us when we start to chain on subsequent filters. So when we perform our filter of .Where(x => x.DateOfBirth > aHundredYearsAgo) the compiler isn't going to get LINQ to Entities' extension methods in on this. No, it's going to get the LINQ to Objects extension methods instead.

This is the cause of our problem. When it comes to execution we're not getting the database to do the heavy lifting because we've moved away from using IQueryable.

Fixing the Problem

There are 2 courses of action open to you. The obvious course of action (and 99% of the time what you'd look to do) is change the signature of the method to return an IQueryable like so:

private IQueryable<Sage> GetSagesWithSayings()
{
    var sageWithSayings = // I prefer 'var', don't you?
        from s in DbContext.Sages.Include(x => x.Sayings)
        select s;

    return sageWithSayings;
}

The other alternative is what I like to think of as "the escape hatch": AsQueryable. This takes an IEnumerable, checks if it's actually an IQueryable slumming it and casts back to that if it is. You might use this in a situation where you didn't have control over the data access code. Using it looks like this (and it would work whether GetSagesWithSayings was returning IEnumerable or IQueryable):

public IEnumerable<Sage> GetSagesWithSayingsBornWithinTheLast100Years()
{
    var aHundredYearsAgo = DateTime.Now.AddYears(-100);
    var sageWithSayings = GetSagesWithSayings().AsQueryable().Where(x => x.DateOfBirth > aHundredYearsAgo);

    return sageWithSayings;
}

Friday, 23 October 2015

The Names Have Been Changed...

...to protect my wallet.

Subsequent to this blog getting a proper domain name a year ago it's now got a new one. That's right, blog.icanmakethiswork.io is dead! Long live blog.johnnyreilly.com!

There's nothing particularly exciting about this, it's more that .io domain names are wayyyyy expensive. And also I noticed that johnnyreilly.com was available. By an accident of history I've ended up either being johnny_reilly or johnnyreilly online. ("[email protected]" was already taken back in 2000 and "[email protected]" was available. I've subsequently become @johnny_reilly on Twitter, johnnyreilly on GitHub so I guess you could say it's stuck.)

So I thought I'd kill 2 birds with one stone and make the switch. I've set up a redirect on blog.icanmakethiswork.io and so, anyone who goes to the old site should be 301'd over here. At least until my old domain name expires. Last time it'll change I promise. Well.... until next time anyway...

Monday, 5 October 2015

jQuery Validation Globalize hits 1.0

This is just a quick post - the tl;dr is this: jQuery Validation Globalize has been ported to Globalize 1.x. Yay! In one of those twists of fate I'm not actually using this plugin in my day job anymore but I thought it might be useful to other people. So here you go. You can read more about this plugin in an older post and you can see a demo of it in action here.

The code did not change drastically - essentially it was just a question of swapping parseFloat for parseNumber and parseDate for a slightly different parseDate. So, we went from this:

(function ($, Globalize) {

    // Clone original methods we want to call into
    var originalMethods = {
        min: $.validator.methods.min,
        max: $.validator.methods.max,
        range: $.validator.methods.range
    };

    // Tell the validator that we want numbers parsed using Globalize

    $.validator.methods.number = function (value, element) {
        var val = Globalize.parseFloat(value);
        return this.optional(element) || ($.isNumeric(val));
    };

    // Tell the validator that we want dates parsed using Globalize

    $.validator.methods.date = function (value, element) {
        var val = Globalize.parseDate(value);
        return this.optional(element) || (val instanceof Date);
    };

    // Tell the validator that we want numbers parsed using Globalize, 
    // then call into original implementation with parsed value

    $.validator.methods.min = function (value, element, param) {
        var val = Globalize.parseFloat(value);
        return originalMethods.min.call(this, val, element, param);
    };

    $.validator.methods.max = function (value, element, param) {
        var val = Globalize.parseFloat(value);
        return originalMethods.max.call(this, val, element, param);
    };

    $.validator.methods.range = function (value, element, param) {
        var val = Globalize.parseFloat(value);
        return originalMethods.range.call(this, val, element, param);
    };

}(jQuery, Globalize));

To this:

(function ($, Globalize) {

    // Clone original methods we want to call into
    var originalMethods = {
        min: $.validator.methods.min,
        max: $.validator.methods.max,
        range: $.validator.methods.range
    };

    // Globalize options - initially just the date format used for parsing
    // Users can customise this to suit them
    $.validator.methods.dateGlobalizeOptions = { dateParseFormat: { skeleton: "yMd" } };

    // Tell the validator that we want numbers parsed using Globalize
    $.validator.methods.number = function (value, element) {
        var val = Globalize.parseNumber(value);
        return this.optional(element) || ($.isNumeric(val));
    };

    // Tell the validator that we want dates parsed using Globalize
    $.validator.methods.date = function (value, element) {
        var val = Globalize.parseDate(value, $.validator.methods.dateGlobalizeOptions.dateParseFormat);
        return this.optional(element) || (val instanceof Date);
    };

    // Tell the validator that we want numbers parsed using Globalize,
    // then call into original implementation with parsed value

    $.validator.methods.min = function (value, element, param) {
        var val = Globalize.parseNumber(value);
        return originalMethods.min.call(this, val, element, param);
    };

    $.validator.methods.max = function (value, element, param) {
        var val = Globalize.parseNumber(value);
        return originalMethods.max.call(this, val, element, param);
    };

    $.validator.methods.range = function (value, element, param) {
        var val = Globalize.parseNumber(value);
        return originalMethods.range.call(this, val, element, param);
    };

}(jQuery, Globalize));
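Stepping back, the pattern both versions rely on is worth seeing in isolation: keep a reference to the original method, then delegate to it with a pre-parsed value. Here's a minimal standalone sketch, with a hypothetical German-style parser standing in for Globalize.parseNumber:

```javascript
// A minimal sketch of the override-and-delegate pattern used above.
// parseNumberDe is a hypothetical stand-in for Globalize.parseNumber.
var methods = {
  min: function (value, element, param) { return value >= param; }
};

// Clone the original method we want to call into
var originalMin = methods.min;

// Naive German-style parser: '.' groups thousands, ',' is the decimal separator
function parseNumberDe(value) {
  return parseFloat(String(value).replace('.', '').replace(',', '.'));
}

// Override: parse first, then delegate to the original implementation
methods.min = function (value, element, param) {
  var val = parseNumberDe(value);
  return originalMin.call(this, val, element, param);
};

console.log(methods.min('1.234,5', null, 1000)); // true - parsed as 1234.5
console.log(methods.min('999,9', null, 1000));   // false - parsed as 999.9
```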

All of which is pretty self-explanatory. The only thing I'd like to draw out is that Globalize 0.1.x didn't force you to specify a date parsing format and, as I recall, would attempt various methods of parsing. For that reason jQuery Validation Globalize 1.0 exposes a $.validator.methods.dateGlobalizeOptions object which allows you to specify the date parsing format you want to use. This means, should you be using a different format from the out-of-the-box one, you can tweak it like so:

$.validator.methods.dateGlobalizeOptions.dateParseFormat = // your date parsing format goes here...

Theoretically, this functionality could be tweaked to allow the user to specify multiple possible date parsing formats to attempt. I'm not certain if that's a good idea though, so it remains unimplemented for now.
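For instance, assuming the raw pattern option that Globalize's date module accepts, switching to an explicit day-first format might look something like this (the pattern itself is purely illustrative):

```javascript
// Illustrative only - any option object accepted by Globalize.parseDate should work here
$.validator.methods.dateGlobalizeOptions.dateParseFormat = { raw: "dd/MM/yyyy" };
```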

Wednesday, 23 September 2015

Definitely Typed Shouldn't Exist

OK - the title's total clickbait but stay with me; there's a point here.

I'm a member of the Definitely Typed team - and hopefully I won't be kicked out for writing this. My point is this: .d.ts files should live with the package they provide typing information for, in npm / GitHub etc. Not separately. TypeScript 1.6 has just been released. Yay! In the release blog post it says this:

We’ve changed module resolution when doing CommonJS output to work more closely to how Node does module resolution. If a module name is non-relative, we now follow these steps to find the associated typings:

  1. Check in node_modules for <module name>.d.ts
  2. Search node_modules\<module name>\package.json for a typings field
  3. Look for node_modules\<module name>\index.d.ts
  4. Then we go one level higher and repeat the process

Please note: when we search through node_modules, we assume these are the packaged node modules which have type information and a corresponding .js file. As such, we resolve only .d.ts files (not .ts file) for non-relative names.

Previously, we treated all module names as relative paths, and therefore we would never properly look in node_modules... We will continue to improve module resolution, including improvements to AMD, in upcoming releases.
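Concretely, step 2 means a package author can opt in with a single package.json field. A purely illustrative example (the package name is invented):

```json
{
  "name": "some-package",
  "version": "1.0.0",
  "main": "index.js",
  "typings": "index.d.ts"
}
```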

The TL;DR is this: consuming npm packages which come with definition files should JUST WORK™... npm is now a first class citizen in TypeScriptLand. So everyone who has a package on npm should now feel duty bound to include a .d.ts when they publish and Definitely Typed can shut up shop. Simple right?

Wrong!

Yeah, it's never going to happen. Surprising as it is, there are many people who are quite happy without TypeScript in their lives (I know - mad right?). These poor unfortunates are unlikely to ever take the extra steps necessary to write definition files. For this reason, there will probably always be a need for a provider of typings such as Definitely Typed. As well as that, the vast majority of people using TypeScript probably don't use npm to manage dependencies. There are, however, an increasing number of users who are using npm. Some (like me) may even be using tools like Browserify (with the TSIFY plugin) or WebPack (with the TS loader) to bring it all together. My feeling is that, over time, using npm will become more common; particularly given the improvements being made to module resolution in the language.

An advantage of shipping typings with an npm package is this: those typings should accurately describe their accompanying package. In Definitely Typed we only aim to support the latest and greatest typings. So if you find yourself looking for the typings of an older version of a package you're going to have to pick your way back through the history of a .d.ts file and hope you happen upon the version you're looking for. Not a fantastic experience.

So I guess what I'm saying is this: if you're an npm package author then it would be fantastic to start shipping a package with typings in the box. If you're using npm to consume packages then using Definitely Typed ought to be the second step you might take after installing a package; the step you only need to take if the package doesn't come with typings. Using DT should be a fallback, not a default.
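In practice that fallback workflow might look like this at the command line (the package name is illustrative; tsd being the Definitely Typed fetcher of the day):

```shell
# First: install the package - if it ships typings you're done
npm install some-package --save

# Only if it doesn't ship typings: fall back to Definitely Typed
tsd install some-package --save
```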

Authoring npm modules with TypeScript

Yup - that's what this post is actually about. See how I lured you in with my mild trolling and pulled the old switcheroo? That's edutainment my friend. So, how do we write npm packages in TypeScript and publish them with their typings? Apparently Gandhi didn't actually say "Be the change you wish to see in the world." Which is a shame. But anyway, I'm going to try and embrace the sentiment here.

Not so long ago I wrote a small npm module called globalize-so-what-cha-want. It is used to determine what parts of Globalize 1.x you need depending on the modules you're planning to use. It also contains a little demo UI / online tool, written in React, which powers this.

For this post, the purpose of the package is rather irrelevant. And even though I've just told you about it, I want you to pretend that the online tool doesn't exist. Pretend I never mentioned it.

What is relevant, and what I want you to think about, is this: I wrote globalize-so-what-cha-want in plain old, honest to goodness JavaScript. Old school.

But, my love of static typing could be held in abeyance for only so long. Once the initial package was written, unit tested and published I got the itch. THIS SHOULD BE WRITTEN IN TYPESCRIPT!!! Well, it didn't have to be but I wanted it to be. Despite having used TypeScript since the early days I'd only been using it for front end work; not for writing npm packages. My mission was clear: port globalize-so-what-cha-want to TypeScript and re-publish to npm.

Port, port, port!!!

At this point globalize-so-what-cha-want consisted of a single index.js file in the root of the package. My end goal was to end up with that file still sat there, but now generated from TypeScript. Alongside it I wanted to see an index.d.ts which was generated from the same TypeScript.

index.js before looked like this:

/* jshint varstmt: false, esnext: false */
var DEPENDENCY_TYPES = {
  SHARED_JSON: 'Shared JSON (used by all locales)',
  LOCALE_JSON: 'Locale specific JSON (supplied for each locale)'
};

var moduleDependencies = {
  'core': {
    dependsUpon: [],
    cldrGlobalizeFiles: ['cldr.js', 'cldr/event.js', 'cldr/supplemental.js', 'globalize.js'],
    json: [
      { dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/likelySubtags.json' }
    ]
  },

  'currency': {
    dependsUpon: ['number','plural'],
    cldrGlobalizeFiles: ['globalize/currency.js'],
    json: [
      { dependencyType: DEPENDENCY_TYPES.LOCALE_JSON, dependency: 'cldr/main/{locale}/currencies.json' },
      { dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/currencyData.json' }
    ]
  },

  'date': {
    dependsUpon: ['number'],
    cldrGlobalizeFiles: ['globalize/date.js'],
    json: [
      { dependencyType: DEPENDENCY_TYPES.LOCALE_JSON, dependency: 'cldr/main/{locale}/ca-gregorian.json' },
      { dependencyType: DEPENDENCY_TYPES.LOCALE_JSON, dependency: 'cldr/main/{locale}/timeZoneNames.json' },
      { dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/timeData.json' },
      { dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/weekData.json' }
    ]
  },

  'message': {
    dependsUpon: ['core'],
    cldrGlobalizeFiles: ['globalize/message.js'],
    json: []
  },

  'number': {
    dependsUpon: ['core'],
    cldrGlobalizeFiles: ['globalize/number.js'],
    json: [
      { dependencyType: DEPENDENCY_TYPES.LOCALE_JSON, dependency: 'cldr/main/{locale}/numbers.json' },
      { dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/numberingSystems.json' }
    ]
  },

  'plural': {
    dependsUpon: ['core'],
    cldrGlobalizeFiles: ['globalize/plural.js'],
    json: [
      { dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/plurals.json' },
      { dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/ordinals.json' }
    ]
  },

  'relativeTime': {
    dependsUpon: ['number','plural'],
    cldrGlobalizeFiles: ['globalize/relative-time.js'],
    json: [
      { dependencyType: DEPENDENCY_TYPES.LOCALE_JSON, dependency: 'cldr/main/{locale}/dateFields.json' }
    ]
  }
};

function determineRequiredCldrData(globalizeOptions) {
  return determineRequired(globalizeOptions, _populateDependencyCurrier('json', function(json) { return json.dependency; }));
}

function determineRequiredCldrGlobalizeFiles(globalizeOptions) {
  return determineRequired(globalizeOptions, _populateDependencyCurrier('cldrGlobalizeFiles', function(cldrGlobalizeFile) { return cldrGlobalizeFile; }));
}

function determineRequired(globalizeOptions, populateDependencies) {
  var modules = Object.keys(globalizeOptions);
  modules.forEach(function(module) {
    if (!moduleDependencies[module]) {
      throw new TypeError('There is no \'' + module + '\' module');
    }
  });

  var requireds = [];
  modules.forEach(function (module) {
    if (globalizeOptions[module]) {
      populateDependencies(module, requireds);
    }
  });

  return requireds;
}

function _populateDependencyCurrier(requiredArray, requiredArrayGetter) {
  var popDepFn = function(module, requireds) {
    var dependencies = moduleDependencies[module];

    dependencies.dependsUpon.forEach(function(requiredModule) {
      popDepFn(requiredModule, requireds);
    });

    dependencies[requiredArray].forEach(function(required) {
      var newRequired = requiredArrayGetter(required);
      if (requireds.indexOf(newRequired) === -1) {
        requireds.push(newRequired);
      }
    });

    return requireds;
  };

  return popDepFn;
}

module.exports = {
  determineRequiredCldrData: determineRequiredCldrData,
  determineRequiredCldrGlobalizeFiles: determineRequiredCldrGlobalizeFiles
};

You can even kind of tell that it was written in JavaScript thanks to the jshint rules at the top.

I fired up Atom and created a new folder src/lib and inside there I created index.ts (yes, index.js renamed) and tsconfig.json. By the way, you'll notice I'm not leaving Atom - I'm making use of the magnificent atom-typescript which you should totally be using too. It rocks.

Now I'm not going to bore you with what I had to do to port the JS to TS (not much). If you're interested, the source is here. What's more interesting is the tsconfig.json - as it's this that drives the generation of the JS and the typings that we need:

{
    "compileOnSave": true,
    "compilerOptions": {
        "module": "commonjs",
        "declaration": true,
        "target": "es5",
        "noImplicitAny": true,
        "suppressImplicitAnyIndexErrors": true,
        "removeComments": false,
        "preserveConstEnums": true,
        "sourceMap": false,
        "outDir": "../../"
    },
    "files": [
        "index.ts"
    ]
}

The things to notice are:

  • module: publishing a commonjs module means it will play well with npm
  • declaration: this is what makes TypeScript generate index.d.ts
  • outDir: we want to regenerate the index.js in the root (2 directories above this)

So now, what do we get when we build in Atom? Well, we're generating an index.js file which looks like this:

var DEPENDENCY_TYPES = {
  SHARED_JSON: 'Shared JSON (used by all locales)',
  LOCALE_JSON: 'Locale specific JSON (supplied for each locale)'
};
var moduleDependencies = {
  'core': {
    dependsUpon: [],
    cldrGlobalizeFiles: ['cldr.js', 'cldr/event.js', 'cldr/supplemental.js', 'globalize.js'],
    json: [
      { dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/likelySubtags.json' }
    ]
  },
  'currency': {
    dependsUpon: ['number', 'plural'],
    cldrGlobalizeFiles: ['globalize/currency.js'],
    json: [
      { dependencyType: DEPENDENCY_TYPES.LOCALE_JSON, dependency: 'cldr/main/{locale}/currencies.json' },
      { dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/currencyData.json' }
    ]
  },
  'date': {
    dependsUpon: ['number'],
    cldrGlobalizeFiles: ['globalize/date.js'],
    json: [
      { dependencyType: DEPENDENCY_TYPES.LOCALE_JSON, dependency: 'cldr/main/{locale}/ca-gregorian.json' },
      { dependencyType: DEPENDENCY_TYPES.LOCALE_JSON, dependency: 'cldr/main/{locale}/timeZoneNames.json' },
      { dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/timeData.json' },
      { dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/weekData.json' }
    ]
  },
  'message': {
    dependsUpon: ['core'],
    cldrGlobalizeFiles: ['globalize/message.js'],
    json: []
  },
  'number': {
    dependsUpon: ['core'],
    cldrGlobalizeFiles: ['globalize/number.js'],
    json: [
      { dependencyType: DEPENDENCY_TYPES.LOCALE_JSON, dependency: 'cldr/main/{locale}/numbers.json' },
      { dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/numberingSystems.json' }
    ]
  },
  'plural': {
    dependsUpon: ['core'],
    cldrGlobalizeFiles: ['globalize/plural.js'],
    json: [
      { dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/plurals.json' },
      { dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/ordinals.json' }
    ]
  },
  'relativeTime': {
    dependsUpon: ['number', 'plural'],
    cldrGlobalizeFiles: ['globalize/relative-time.js'],
    json: [
      { dependencyType: DEPENDENCY_TYPES.LOCALE_JSON, dependency: 'cldr/main/{locale}/dateFields.json' }
    ]
  }
};
function determineRequired(globalizeOptions, populateDependencies) {
  var modules = Object.keys(globalizeOptions);
  modules.forEach(function (module) {
    if (!moduleDependencies[module]) {
      throw new TypeError('There is no \'' + module + '\' module');
    }
  });
  var requireds = [];
  modules.forEach(function (module) {
    if (globalizeOptions[module]) {
      populateDependencies(module, requireds);
    }
  });
  return requireds;
}
function _populateDependencyCurrier(requiredArray, requiredArrayGetter) {
  var popDepFn = function (module, requireds) {
    var dependencies = moduleDependencies[module];
    dependencies.dependsUpon.forEach(function (requiredModule) {
      popDepFn(requiredModule, requireds);
    });
    dependencies[requiredArray].forEach(function (required) {
      var newRequired = requiredArrayGetter(required);
      if (requireds.indexOf(newRequired) === -1) {
        requireds.push(newRequired);
      }
    });
    return requireds;
  };
  return popDepFn;
}
/**
 * The string array returned will contain a list of the required cldr json data you need. I don't believe ordering matters for the json but it is listed in the same dependency order as the cldr / globalize files are.
 *
 * @param options The globalize modules being used.
 */
function determineRequiredCldrData(globalizeOptions) {
  return determineRequired(globalizeOptions, _populateDependencyCurrier('json', function (json) { return json.dependency; }));
}
exports.determineRequiredCldrData = determineRequiredCldrData;
/**
 * The string array returned will contain a list of the required cldr / globalize files you need, listed in the order they are required.
 *
 * @param options The globalize modules being used.
 */
function determineRequiredCldrGlobalizeFiles(globalizeOptions) {
  return determineRequired(globalizeOptions, _populateDependencyCurrier('cldrGlobalizeFiles', function (cldrGlobalizeFile) { return cldrGlobalizeFile; }));
}
exports.determineRequiredCldrGlobalizeFiles = determineRequiredCldrGlobalizeFiles;

Aside from one method moving internally and me adding some JSDoc, the only really notable change is the end of the file. TypeScript, when generating commonjs, doesn't use the module.exports = {} approach. Rather, it drops exported functions onto the exports object as functions are exported. Functionally this is identical.
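If that sounds abstract, this tiny sketch (not the real generated code) shows why the two styles are functionally identical in a commonjs world:

```javascript
// Not the real generated output - just a minimal sketch of the two export styles.

// Style 1: what the hand-written JavaScript did - assign one object at the end.
var moduleA = { exports: {} };
moduleA.exports = {
  determineStuff: function () { return 42; }
};

// Style 2: what TypeScript's commonjs output does - attach each export
// to the existing exports object as it is declared.
var moduleB = { exports: {} };
moduleB.exports.determineStuff = function () { return 42; };

console.log(moduleA.exports.determineStuff()); // 42
console.log(moduleB.exports.determineStuff()); // 42
```

Either way, a consumer calling require ends up with an object exposing the same functions.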

Now for our big finish: happily sat alongside index.js is the index.d.ts file:

export interface Options {
  currency?: boolean;
  date?: boolean;
  message?: boolean;
  number?: boolean;
  plural?: boolean;
  relativeTime?: boolean;
}
/**
 * The string array returned will contain a list of the required cldr json data you need. I don't believe ordering matters for the json but it is listed in the same dependency order as the cldr / globalize files are.
 *
 * @param options The globalize modules being used.
 */
export declare function determineRequiredCldrData(globalizeOptions: Options): string[];
/**
 * The string array returned will contain a list of the required cldr / globalize files you need, listed in the order they are required.
 *
 * @param options The globalize modules being used.
 */
export declare function determineRequiredCldrGlobalizeFiles(globalizeOptions: Options): string[];

We're there, huzzah! This has been now published to npm - anyone consuming this package can use TypeScript straight out of the box. I really hope that publishing npm packages in this fashion becomes much more commonplace. Time will tell.

PS I'm not the only one

I was just about to hit "publish" when I happened upon Basarat's ts-npm-module, a project on GitHub which demos how to publish and consume TypeScript using npm. I'd say great minds think alike but I'm pretty sure Basarat's mind is far greater than mine! (Cough, atom-typescript, cough.) Either way, it's good to see validation for the approach I'm suggesting.

PPS Update 23/09/2015 09:51

One of the useful things about writing a blog is that you get to learn. Since I published I've become aware of a few things somewhat relevant to this post. First of all, there is still work ongoing in TypeScript land around this topic. Essentially there are problems resolving dependency conflicts when different dependencies have different versions - you can take part in the ongoing discussion here. There are also some useful resources to look at.

Thursday, 10 September 2015

Things Done Changed

Some people fear change. Most people actually. I'm not immune to that myself, but not in the key area of technology. Any developer that fears change when it comes to the tools and languages that he / she is using is in the wrong business. Because what you're using to cut code today will not last. The language will evolve, the tools and frameworks that you love will die out and be replaced by new ones that are different and strange. In time, the language you feel you write as a native will fall out of favour, replaced by a new upstart.

My first gig was writing telecoms software using Delphi. I haven't touched Delphi (or telecoms for that matter) for over 10 years now. Believe me, I grok that things change.

That is the developer's lot. If you're able to accept that then you'll be just fine. For my part I've always rather relished the new and so I embrace it. However, I've met a surprising number of devs that are outraged when they realise that the language and tools they have used since their first job are not going to last. They do not go gentle into that good dawn. They rage, rage against the death of WebForms. My apologies to Dylan Thomas.

I recently started a new contract. This always brings a certain amount of change. This is part of the fun of contracting. However, the change was more significant in this case. As a result, the tools that I've been using for the last couple of months have been rather different to those that I'm used to. I've been outside my comfort zone. I've loved it. And now I want to reflect upon it. Because, in the words of Socrates, "the unexamined life is not worth living".

The Shock of the New (Toys)

I'd been brought in to work on a full stack ASP.Net project. However, I've initially been working on a separate project which is entirely different. A web client app which has nothing to do with ASP.Net at all. It's a greenfield app which is built using the following:

  1. React / Flux
  2. WebSockets / Protocol Buffers
  3. Browserify
  4. ES6 with Babel
  5. Karma
  6. Gulp
  7. Atom

Where to begin? Perhaps at the end - Atom.

How Does it Feel to be on Your Own?

When all around you, as far as the eye can see, are monitors displaying Visual Studio in all its grey glory, whilst you alone are hammering away on Atom? It felt pretty good actually.

The app I was working on was a React / Flux app. You know what that means? JSX! At the time the project began Visual Studio did not have good editor support for JSX (something that the shipping of VS 2015 may have remedied but I haven't checked). So, rather than endure a life dominated with red squigglies I jumped ship and moved over to using GitHub's Atom to edit code.

I rather like it. Whilst VS is a full blown IDE, Atom is a text editor. A very pretty one. And crucially, one that can be extended by plugins of which there is a rich ecosystem. You want JSX support? There's a plugin for that. You want something that formats JSON nicely for you? There's a plugin for that too.

My only real criticisms of Atom are that it doesn't handle large files well and that it crashes a lot. I'm quite surprised by both of these characteristics given that, in contrast to VS, it is so small. You'd think the basics would be better covered. Go figure. It still rocks though. It looks so sexy - how could I not like it?

Someone to watch over me

I've been using Gulp for a while now. It's a great JavaScript task runner and incredibly powerful. Previously though, I've used it as part of a manual build step (even plumbing it into my csproj). With the current project I've moved over to using the watch functionality of gulp. So I'm scrapping triggering gulp tasks manually. Instead we have gulp configured to gaze lovingly at the source files and, when they change, re-run the build steps.

This is nice for a couple of reasons:

  • When I want to test out the app the build is already done - I don't have to wait for it to happen.
  • When I do bad things I find out faster. So I've got JSHint being triggered by my watch. If I write code that makes JSHint sad (and I haven't noticed the warnings from the atom plugin) then they'll appear in the console. Likewise, my unit tests are continuously running in response to file changes (in an ncrunch-y sorta style) and so I know straight away if I'm breaking tests. Rather invaluable in the dynamic world of JavaScript.
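The wiring for this is small. Something along these lines does the job (a sketch against the gulp 3.x API; the task names are placeholders for your own):

```javascript
var gulp = require('gulp');

// 'lint', 'build' and 'test' are hypothetical tasks defined elsewhere in the
// gulpfile; watch simply re-runs them whenever a source file changes.
gulp.task('watch', function () {
  gulp.watch('src/**/*.js', ['lint', 'build', 'test']);
});
```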

Karma, Karma, Karma, Chameleon

In the past, when using Visual Studio, it made sense to use the mighty Chutzpah which allows you to run JS unit tests from within VS itself. I needed a new way to run my Jasmine unit tests. The obvious choice was Karma (built by the Angular team I think?). It's really flexible.

You're using Browserify? No bother. You're writing ES6 and transpiling with Babel? Not a problem. You want code coverage? That we can do. You want an integration for TeamCity? That too is sorted....
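A karma.conf.js pulling those pieces together might look roughly like this (a sketch; the plugin and reporter names are as I recall them, so treat this as indicative rather than gospel):

```javascript
// karma.conf.js - indicative only
module.exports = function (config) {
  config.set({
    frameworks: ['browserify', 'jasmine'],
    files: ['test/**/*.spec.js'],
    preprocessors: {
      'test/**/*.spec.js': ['browserify'] // bundle the tests before running
    },
    browserify: {
      transform: ['babelify']             // transpile ES6 on the way through
    },
    reporters: ['progress', 'teamcity']   // teamcity reporter for CI
  });
};
```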

Karma is fantastic. Fun fact: originally it was called Testacular. I kind of get why they changed the name but the world is all the duller for it. A side effect of the name change is that due to invalid search results I know a lot more about Buddhism than I used to.

I cry Babel, Babel, look at me now

Can you not wait for the future? Me neither. Even though it's 2015 and Back to the Future II takes place in only a month's time. So I'm not waiting for a million and one browsers to implement ES6 and IE 8 to finally die dammit. Nope, I have a plan and it's Babel. Transpile the tears away, write ES6 and have Babel spit out EStoday.

I've found this pretty addictive. Once you've started using ES6 features it's hard to go back. Take destructuring - I can't get enough of it.
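For the uninitiated, here's a taste of what I mean (runnable anywhere ES6 destructuring is supported):

```javascript
// Object destructuring: pluck properties straight into local variables
const pkg = { name: 'globalize-so-what-cha-want', version: '1.0.0' };
const { name, version } = pkg;
console.log(name);    // globalize-so-what-cha-want
console.log(version); // 1.0.0

// It works in parameter lists too, which makes for lovely tidy functions
const describe = ({ name, version }) => name + '@' + version;
console.log(describe(pkg)); // globalize-so-what-cha-want@1.0.0
```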

Whilst I love Babel, it has caused me some sadness as well. My beloved TypeScript is currently not in the mix; Babel is instead sat squarely where I want TS to be. I'm without static types and rather bereft. You can certainly live without them but, having done so for a while, I'm pretty clear now that static typing is a massive productivity win. You don't have to hold the data structures that you're working on in your head so much; code completion gives you what you need there, and you can focus more on the problem that you're trying to solve. You also burn less time on silly mistakes. If you accidentally change the return type of a function you're likely to know straight away. Refactoring is so much harder without static types. I could go on.

All this goes to say: I want my static typing back. It wasn't really an option to use TypeScript in the beginning because we were using JSX and TS didn't support it. However! TypeScript is due to add support for JSX in TS 1.6 (currently in beta). I've plans to see if I can get TypeScript to emit ES6 and then keep using Babel to do the transpilation. Whether this will work, I don't know yet. But it seems likely. So I'm hoping for a bright future.

Browserify (there are no song lyrics that can be crowbarred in)

Emitting scripts in the required order has been a perpetual pain for everyone in the web world for the longest time. I've taken about 5 different approaches to solving this problem over the years. None felt particularly good. So Browserify.

Browserify solves the problem of script ordering for you by allowing you to define an entry point to your application and getting you to write require (npm style) or import (ES6 modules) to bring in the modules that you need. This allows Browserify (which we're using with Babel thanks to the babelify transform) to create a ginormous js file that contains all the required scripts in the correct order. Thanks to the magic of source maps it also allows us to debug our original code (yup, the original ES6!) Browserify has the added bonus of allowing us free rein to pull in npm packages to use in our app without a great deal of ceremony.
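To make that concrete, an entry point in the browserify world looks something like this (the module names and structure here are invented for illustration):

```javascript
// main.js - a hypothetical entry point. Browserify starts here and walks
// every require to work out what belongs in the bundle, and in what order.
var React = require('react');                  // npm packages come along for free
var AppStore = require('./stores/app-store');  // local modules via relative paths
var App = require('./components/app');

React.render(React.createElement(App, { store: AppStore }), document.body);
```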

Browserify is pretty fab - my only real reservation is that if you step outside the normal use cases you can quickly find yourself in deep water. Take for instance web workers. We were briefly looking into using them as an optimisation (breaking IO onto a separate process from the UI). A prime reason for backing away from this is that Web Workers don't play particularly well with Browserify. And when you've got Babel (or babelify) in the mix the problems just multiply. That apart, I really dig Browserify. I think I'd like to give WebPack a go as well, as I understand it fulfills a similar purpose.

WebSockets / Protocol Buffers

The application I'm working on is plugging into an existing system which uses WebSockets for communication. Since WebSockets are native to the web we've been able to plumb these straight into our app. We're also using Protocol Buffers as another optimisation; a way to save a few extra bytes from going down the wire. I don't have much to say about either, just some observations really:

  1. WebSockets is a slightly different way of working - permanently open connections as opposed to the request / response paradigm of classic HTTP
  2. WebSockets are wicked fast (due in part to those permanent connections), so performance is amazing - "fast like native" amazing. In our case performance is pretty important and so this has been really great.

React / Flux

Finally, React and Flux. I was completely new to these when I came onto the project and I quickly came to love them. There was a prejudice for me to overcome and that was JSX. When I first saw it I felt a little sick. "Have we learned NOTHING???" I wailed. "Don't we know that embedding strings in our controllers is a BAD thing?" I was wrong. I had an epiphany. I discovered that JSX is not, as I first imagined, embedded HTML strings. Actually it's syntactic sugar for object creation. A simple example:

var App;

// Some JSX:
var app = <App version="1.0.0" />;

// What that JSX transpiles into:
var app = React.createElement(App, {version:"1.0.0"});

Now that I'm well used to JSX and React I've really got to like it. I keep my views / components as dumb as possible and do all the complex logic in the stores. The stores are just standard JavaScript and so, pretty testable (simple Jasmine gives you all you need - I haven't felt the need for Jest). The components / views are also completely testable. I'd advise anyone coming to React afresh to make use of the ReactShallowRenderer for these purposes. This means you can test without a DOM - much better all round.

I don't have a great deal to say about Flux; I think my views on it aren't fully formed yet. I do really like the predictability that unidirectional data flow gives you. However, I'm mindful that the app that I've been writing is very much about displaying data and doesn't require much user input. I know that I'm living without 2-way data binding and I do wonder if I would come to miss it. Time will tell.

I really want to get back to static typing. That either means TypeScript (which I know and love) or Facebook's Flow. (A Windows version of Flow is in the works.) I'll be very happy if I get either into the mix... Watch this space.

Thursday, 13 August 2015

(Top One, Nice One) Get Sorted

I was recently reading a post by Jaime González García which featured the following mind-bending proposition:

What if I told you that JavaScript has LINQ??

It got me thinking about one of my favourite features of LINQ: ordering using OrderBy, ThenBy... The ability to simply expose a collection of objects in a given order with a relatively terse and descriptive syntax. It is fantastically convenient, expressive and something I've been missing in JavaScript. But if Jaime is right... Well, let's see what we can do.

Sort

JavaScript arrays have a sort method. To quote MDN:

arr.sort([compareFunction])
compareFunction

Optional. Specifies a function that defines the sort order. If omitted, the array is sorted according to each character's Unicode code point value, according to the string conversion of each element.

We want to use the sort function to introduce some LINQ-ish ordering goodness. Sort of. See what I did there?

Before we get going it's worth saying that LINQ's OrderBy and JavaScript's sort are not the same thing. sort actually changes the order of the array. However, OrderBy returns an IOrderedEnumerable which when iterated returns the items of the collection in a particular order. An important difference. If preserving the original order of my array was important to me (spoiler: mostly it isn't) then I could make a call to slice prior to calling sort.
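To spell that difference out, here's a quick sketch you can run anywhere:

```javascript
var original = [3, 1, 2];

// sort mutates the array in place and returns that very same array...
var sorted = original.sort(function (a, b) { return a - b; });
console.log(sorted === original); // true
console.log(original);            // [ 1, 2, 3 ] - the original order is gone

// ...so slice first if the source order matters to you:
var source = [3, 1, 2];
var copySorted = source.slice().sort(function (a, b) { return a - b; });
console.log(copySorted === source); // false
console.log(source);                // [ 3, 1, 2 ] - untouched
```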

sort also returns the array to the caller which is nice for chaining and means we can use it in a similar fashion to the way we use OrderBy. With that in mind, we're going to create comparer functions which will take a lambda / arrow function (ES6 alert!) and return a function which will compare based on the supplied lambda.

String Comparer

Let's start with ordering by string properties:

function stringComparer(propLambda) {
  return (obj1, obj2) => {
    const obj1Val = propLambda(obj1) || '';
    const obj2Val = propLambda(obj2) || '';
    return obj1Val.localeCompare(obj2Val);
  };
}

We need some example data to sort: (I can only apologise for my lack of inspiration here)

const foodInTheHouse = [
  { what: 'cake',   daysSincePurchase: 2 },
  { what: 'apple',  daysSincePurchase: 8 },
  { what: 'orange', daysSincePurchase: 6 },
  { what: 'apple',  daysSincePurchase: 2 },
];

If we were doing a sort by strings in LINQ we wouldn't need to implement our own comparer. And the code we'd write would look something like this:

var foodInTheHouseSorted = foodInTheHouse.OrderBy(x => x.what);

With that in mind, here's how it would look to use our shiny and new stringComparer:

const foodInTheHouseSorted = foodInTheHouse.sort(stringComparer(x => x.what));

// foodInTheHouseSorted: [
//   { what: 'apple',  daysSincePurchase: 8 },
//   { what: 'apple',  daysSincePurchase: 2 },
//   { what: 'cake',   daysSincePurchase: 2 },
//   { what: 'orange', daysSincePurchase: 6 }
// ]

// PS Don't forget, for our JavaScript: foodInTheHouse === foodInTheHouseSorted
//    But for the LINQ:                 foodInTheHouse !=  foodInTheHouseSorted
//    
//    However, if I'd done this:

const foodInTheHouseSlicedAndSorted = foodInTheHouse.slice().sort(stringComparer(x => x.what));

//    then:                             foodInTheHouse !== foodInTheHouseSlicedAndSorted 
//
//    I shan't mention this again.

Number Comparer

Well that's strings sorted (quite literally). Now, what about numbers?

function numberComparer(propLambda) {
  return (obj1, obj2) => {
    const obj1Val = propLambda(obj1);
    const obj2Val = propLambda(obj2);
    if (obj1Val > obj2Val) {
      return 1;
    }
    else if (obj1Val < obj2Val) {
      return -1;
    }
    return 0;
  };
}

If we use the numberComparer on our original array it looks like this:

const foodInTheHouseSorted = foodInTheHouse.sort(numberComparer(x => x.daysSincePurchase));

// foodInTheHouseSorted: [
//   { what: 'cake',   daysSincePurchase: 2 },
//   { what: 'apple',  daysSincePurchase: 2 },
//   { what: 'orange', daysSincePurchase: 6 },
//   { what: 'apple',  daysSincePurchase: 8 }
// ]
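As an aside, when the property values are plain finite numbers the same contract can be met with a terser subtraction-based comparer. This is just a sketch (numberComparerTerse is my own name for it) and the explicit version above is arguably clearer:

```javascript
// Subtraction yields exactly the negative/zero/positive result
// that sort expects. Assumes the property values are finite numbers.
const numberComparerTerse = propLambda =>
  (obj1, obj2) => propLambda(obj1) - propLambda(obj2);

const byDays = numberComparerTerse(x => x.daysSincePurchase);
// byDays({ daysSincePurchase: 8 }, { daysSincePurchase: 2 }) is positive
```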

Descending Into the Pit of Success

Well this is all kinds of fabulous. But something's probably nagging at you... What about OrderByDescending? What about when I want to sort in the reverse order? May I present the reverse function:

function reverse(comparer) {
  return (obj1, obj2) => comparer(obj1, obj2) * -1;
}

As the name suggests, this function takes the comparer it's handed and returns a function that inverts the result of executing that comparer. Clear as mud? A comparer can return three kinds of value:

  • 0 - implies equality for obj1 and obj2
  • positive - implies obj1 is greater than obj2 by the ordering criterion
  • negative - implies obj1 is less than obj2 by the ordering criterion
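To make that contract concrete, here's a toy illustration using a bare numeric comparer and the reverse wrapper:

```javascript
// A minimal numeric comparer plus the reverse wrapper from above.
const compareNums = (a, b) => (a > b) ? 1 : (a < b) ? -1 : 0;
const reverse = comparer => (obj1, obj2) => comparer(obj1, obj2) * -1;
const reversed = reverse(compareNums);

// compareNums(1, 2) is -1 (1 sorts before 2)
// reversed(1, 2) is 1    (1 now sorts after 2)
// reversed(2, 2) is 0    (equality is unaffected)
```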

Our reverse function takes the comparer it is given and returns a new comparer that will return a positive value where the old one would have returned a negative, and vice versa. (Equality is unaffected.) An alternative implementation would have been this:

function reverse(comparer) {
  return (obj1, obj2) => comparer(obj2, obj1);
}

This is more efficient and even simpler, as it just swaps the values supplied to the comparer. Whatever tickles your fancy. Either way, when used it looks like this:

const foodInTheHouseSorted = foodInTheHouse.sort(reverse(stringComparer(x => x.what)));

// foodInTheHouseSorted: [
//   { what: 'orange', daysSincePurchase: 6 },
//   { what: 'cake',   daysSincePurchase: 2 },
//   { what: 'apple',  daysSincePurchase: 8 },
//   { what: 'apple',  daysSincePurchase: 2 }
// ]

If you'd rather not have a function wrapping a function inline then you could create stringComparerDescending, numberComparerDescending etc. implementations. Arguably that might make for a nicer API. I'm not unhappy with the present approach myself and so I'll leave it as is. But it's an option.
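For what it's worth, such a descending variant would be a one-liner built from the pieces we already have; a sketch (with the earlier functions repeated so it stands alone):

```javascript
// Reproducing the earlier functions so this example is self-contained.
const stringComparer = propLambda => (obj1, obj2) =>
  (propLambda(obj1) || '').localeCompare(propLambda(obj2) || '');
const reverse = comparer => (obj1, obj2) => comparer(obj2, obj1);

// The hypothetical descending flavour is just composition:
const stringComparerDescending = propLambda => reverse(stringComparer(propLambda));

const fruit = [{ what: 'apple' }, { what: 'orange' }, { what: 'cake' }];
const sorted = fruit.sort(stringComparerDescending(x => x.what));
// sorted: orange, cake, apple
```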

ThenBy

So far we can sort arrays by strings, we can sort arrays by numbers and we can do either in descending order. It's time to take it to the next level people. That's right ThenBy; I want to be able to sort by one criterion and then by a subcriterion. So perhaps I want to eat the food in the house in alphabetical order, but if I have multiple apples I want to eat the ones I bought most recently first (because the other ones look old, brown and yukky). This may also be a sign I haven't thought my life through, but it's a choice that people make. People that I know. People I may have married.

It's time to compose our comparers together. May I present... drum roll.... the composeComparers function:

function composeComparers(...comparers) {
  return (obj1, obj2) => {
    const comparer = comparers.find(c => c(obj1, obj2) !== 0);
    return (comparer) ? comparer(obj1, obj2) : 0;
  };
}

This fine function takes any number of comparers. It then returns a comparer function which, when called, executes each of the original comparers in turn until it finds one that returns a non-zero value (i.e. one that says the 2 items are not equal). It then sends that non-zero value back, or 0 if all were equal.

const foodInTheHouseSorted = foodInTheHouse.sort(composeComparers(
    stringComparer(x => x.what),
    numberComparer(x => x.daysSincePurchase)
));

// foodInTheHouseSorted: [
//   { what: 'apple',  daysSincePurchase: 2 },
//   { what: 'apple',  daysSincePurchase: 8 },
//   { what: 'cake',   daysSincePurchase: 2 },
//   { what: 'orange', daysSincePurchase: 6 }
// ]

composeComparers: The Sequel

I'm not gonna lie - I was feeling quite pleased with this approach. I shared it with my friend (and repeated colleague) Peter Foldi. The next day I found this in my inbox:

function composeComparers(...comparers) {
  return (obj1, obj2) => comparers.reduce((prev, curr) => prev || curr(obj1, obj2), 0);
}

Dammit, he's improved it. It's down to one line of code, it doesn't execute a non-zero-returning comparer twice and it doesn't rely on find, which only arrives with ES6. So if you wanted to backport to ES5 then this is the better choice.

The only criticism I can make of it is that it iterates through each of the comparers even when it doesn't need to execute them. But that's just carping really.
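On the ES5 point: since reduce is already available in ES5, a hypothetical backport of Peter's version (my own sketch, not his code) only needs arguments in place of the rest parameter:

```javascript
// A hypothetical ES5 backport: only the rest parameter needs replacing,
// since Array.prototype.reduce is already available in ES5.
function composeComparers() {
  var comparers = Array.prototype.slice.call(arguments);
  return function (obj1, obj2) {
    return comparers.reduce(function (prev, curr) {
      return prev || curr(obj1, obj2);
    }, 0);
  };
}

// A toy comparer to exercise it.
var byLength = function (a, b) { return a.length - b.length; };
var sorted = ['pear', 'fig', 'apple'].sort(composeComparers(byLength));
// sorted: ['fig', 'pear', 'apple']
```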

composeComparers: The Ultimate

So naturally I thought I was done. Showing Peter's improvements to the estimable Matthew Horsley I learned that this was not so. Because he reached for the keyboard and entered this:

function composeComparers(...comparers) {
  // README: https://wiki.haskell.org/Function_composition
  return comparers.reduce((prev, curr) => (a, b) => prev(a, b) || curr(a, b));
}

That's right, he's created a function which takes a number of comparers and reduced them up front into a single comparer function. This means that when the sort takes place there is no longer a need to iterate through the comparers, just execute them.

I know.

I'll get my coat...
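Before moving on, a quick self-contained sanity check (with a couple of toy comparers of my own) shows Matthew's pre-reduced comparer sorting exactly as you'd hope:

```javascript
function composeComparers(...comparers) {
  return comparers.reduce((prev, curr) => (a, b) => prev(a, b) || curr(a, b));
}

// Two toy comparers: by length first, then alphabetically.
const byLength = (a, b) => a.length - b.length;
const alphabetical = (a, b) => a.localeCompare(b);

const sorted = ['pear', 'fig', 'plum', 'kiwi'].sort(
  composeComparers(byLength, alphabetical)
);
// sorted: ['fig', 'kiwi', 'pear', 'plum']
```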

Thursday, 30 July 2015

Upgrading to Globalize 1.x for Dummies

Globalize has hit 1.0. Anyone who reads my blog will likely be aware that I'm a long time user of Globalize 0.1.x. I've been a little daunted by the leap that the move from 0.1.x to 1.x represents. It appears to be the very definition of "breaking changes". :-) But hey, this is Semantic Versioning being used correctly so how could I complain? Either way, I've decided to write up the migration here as I'm not expecting this to be easy.

To kick things off I've set up a very simple repo that consists of a single page that depends upon Globalize 0.1.x to render a number and a date in German. It looks like this:

<html>
  <head>
    <title>Globalize demo</title>
    <link href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.5/css/bootstrap.min.css" rel="stylesheet">
  </head>
  <body>
    <div class="container-fluid">
      <h4>Globalize demo for the <em id="locale"></em> culture / locale</h4>
      <p>This is the number <strong id="number"></strong> formatted by Globalize: 
          <strong id="numberFormatted"></strong></p>
      <p>This is the date <strong id="date"></strong> formatted by Globalize: 
          <strong id="dateFormatted"></strong></p>
    </div>

    <script src="bower_components/globalize/lib/globalize.js"></script>
    <script src="bower_components/globalize/lib/cultures/globalize.culture.de-DE.js"></script>
    <script>
      var locale = 'de-DE';
      var number = 12345.67;
      var date = new Date(2012, 5, 15);

      Globalize.culture( locale );
      document.querySelector('#locale').innerText = locale;
      document.querySelector('#number').innerText = number;
      document.querySelector('#date').innerText = date;
      document.querySelector('#numberFormatted').innerText = Globalize.format(number, 'n2');
      document.querySelector('#dateFormatted').innerText = Globalize.format(date, 'd');
    </script>
  </body>
</html>

When it's run it looks like this:

Let's see how we go about migrating this super simple example.

Update our Bower dependencies

First things first, we want to move Globalize from 0.1.x to 1.x using Bower. To do that we update our bower.json:

  "dependencies": {
    "globalize": "^1.0.0"
  }

Now we enter: bower update. And we're off!

bower globalize#^1.0.0          cached git://github.com/jquery/globalize.git#1.0.0
bower globalize#^1.0.0        validate 1.0.0 against git://github.com/jquery/globalize.git#^1.0.0
bower cldr-data#>=25            cached git://github.com/rxaviers/cldr-data-bower.git#27.0.3
bower cldr-data#>=25          validate 27.0.3 against git://github.com/rxaviers/cldr-data-bower.git#>=25
bower cldrjs#0.4.1              cached git://github.com/rxaviers/cldrjs.git#0.4.1
bower cldrjs#0.4.1            validate 0.4.1 against git://github.com/rxaviers/cldrjs.git#0.4.1
bower globalize#^1.0.0         install globalize#1.0.0
bower cldr-data#>=25           install cldr-data#27.0.3
bower cldrjs#0.4.1             install cldrjs#0.4.1

globalize#1.0.0 bower_components\globalize
├── cldr-data#27.0.3
└── cldrjs#0.4.1

cldr-data#27.0.3 bower_components\cldr-data

cldrjs#0.4.1 bower_components\cldrjs
└── cldr-data#27.0.3

This all looks happy enough. Except it's actually not.

We need fuel

Or, as I like to call it, cldr-data. We just pulled down Globalize 1.x but we didn't pull down the data that Globalize 1.x relies upon. This is one of the differences between Globalize 0.1.x and 1.x. Globalize 1.x does not include the "culture" data. By which I mean all the globalize.culture.de-DE.js type files. Instead Globalize 1.x relies upon CLDR - Unicode Common Locale Data Repository. It does this in the form of cldr-json.

Now before you start to worry, you shouldn't actually need to go and get this by yourself, the lovely Rafael Xavier de Souza has saved you a job by putting together Bower and npm modules to do the hard work for you.

I'm using Bower for my client side package management and so I'll use that. Looking at the Bower dependencies downloaded when I upgraded my package I can see there is a cldr-data package. Yay! However it appears to be missing the associated json files. Boo!

To the documentation Batman. It says you need a .bowerrc file in your repo which contains this:

{
  "scripts": {
    "preinstall": "npm install [email protected]",
    "postinstall": "node ./node_modules/cldr-data-downloader/bin/download.js -i bower_components/cldr-data/index.json -o bower_components/cldr-data/"
  }
}

Unfortunately, because I've already upgraded to v1 adding this file alone doesn't do anything for me. To get round that I delete my bower_components folder and enter bower install. Boom!

bower globalize#^1.0.0          cached git://github.com/jquery/globalize.git#1.0.0
bower globalize#^1.0.0        validate 1.0.0 against git://github.com/jquery/globalize.git#^1.0.0
bower cldrjs#0.4.1                        cached git://github.com/rxaviers/cldrjs.git#0.4.1
bower cldrjs#0.4.1                      validate 0.4.1 against git://github.com/rxaviers/cldrjs.git#0.4.1
bower cldr-data#>=25                      cached git://github.com/rxaviers/cldr-data-bower.git#27.0.3
bower cldr-data#>=25                    validate 27.0.3 against git://github.com/rxaviers/cldr-data-bower.git#>=25
bower                                 preinstall npm install [email protected]
bower                                 preinstall [email protected] node_modules\cldr-data-downloader
bower                                 preinstall ├── [email protected]
bower                                 preinstall ├── [email protected]
bower                                 preinstall ├── [email protected] ([email protected])
bower                                 preinstall ├── [email protected] ([email protected])
bower                                 preinstall ├── [email protected] ([email protected])
bower                                 preinstall ├── [email protected]
bower                                 preinstall ├── [email protected] ([email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected])
bower                                 preinstall └── [email protected] ([email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected])
bower globalize#^1.0.0                   install globalize#1.0.0
bower cldrjs#0.4.1                       install cldrjs#0.4.1
bower cldr-data#>=25                     install cldr-data#27.0.3
bower                                postinstall node ./node_modules/cldr-data-downloader/bin/download.js -i bower_components/cldr-data/index.json -o bower_components/cldr-data/
bower                                postinstall GET `https://github.com/unicode-cldr/cldr-core/archive/27.0.3.zip`
bower                                postinstall GET `https://github.com/unicode-cldr/cldr-dates-modern/archive/27.0.3.zip`
bower                                postinstall GET `https://github.com/unicode-cldr/cldr-cal-buddhist-modern/archive/27.0.3.zip`
bower                                postinstall GET `https://github.com/unicode-cldr/cldr-cal-chinese-modern/archive/27.0.3.zip`
bower                                postinstall GET `https://github.com/unicode-cldr/cldr-cal-coptic-modern/archive/27.0.3.zip`
bower                                postinstall GET `https://github.com/unicode-cldr/cldr-cal-dangi-modern/archive/27.0.3.zip`
bower                                postinstall GET `https://github.com/unicode-cldr/cldr-cal-ethiopic-modern/archive/27.0.3.zip`
bower                                postinstall GET `https://github.com/unicode-cldr/cldr-cal-hebrew-modern/archive/27.0.3.zip`
bower                                postinstall GET `https://github.com/unicode-cldr/cldr-cal-indian-modern/archive/27.0.3.zip`
bower                                postinstall GET `https://github.com/unicode-cldr/cldr-cal-islamic-modern/archive/27.0.3.zip`
bower                                postinstall GET `https://github.com/unicode-cldr/cldr-cal-japanese-modern/archive/27.0.3.zip`
bower                                postinstall GET `https://github.com/unicode-cldr/cldr-cal-persian-modern/archive/27.0.3.zip`
bower                                postinstall GET `https://github.com/unicode-cldr/cldr-cal-roc-modern/archive/27.0.3.zip`
bower                                postinstall GET `https://github.com/unicode-cldr/cldr-localenames-modern/archive/27.0.3.zip`
bower                                postinstall GET `https://github.com/unicode-cldr/cldr-misc-modern/archive/27.0.3.zip`
bower                                postinstall GET `https://github.com/unicode-cldr/cldr-numbers-modern/archive/27.0.3.zip`
bower                                postinstall GET `https://github.com/unicode-cldr/cldr-segments-modern/archive/27.0.3.zip`
bower                                postinstall GET `https://github.com/unicode-cldr/cldr-units-modern/archive/27.0.3.zip`
bower                                postinstall Received 28728K total.
bower                                postinstall Received 28753K total.
bower                                postinstall Unpacking it into `./bower_components\cldr-data`

globalize#1.0.0 bower_components\globalize
├── cldr-data#27.0.3
└── cldrjs#0.4.1

cldrjs#0.4.1 bower_components\cldrjs
└── cldr-data#27.0.3

cldr-data#27.0.3 bower_components\cldr-data

That's right - I'm golden. And if I didn't want to do that I could have gone straight to the command line and entered this: (look familiar?)

npm install [email protected]
node ./node_modules/cldr-data-downloader/bin/download.js -i bower_components/cldr-data/index.json -o bower_components/cldr-data/

Some bitching and moaning.

If, like me, you were a regular user of Globalize 0.1.x then you know that you needed very little to get going. As you can see from our example you just serve up Globalize.js and the culture files you are interested in (eg globalize.culture.de-DE.js). That's it - you have all you need; job's a good'un. This is all very convenient and entirely lovely.

Globalize 1.x has a different approach and one that (I have to be honest) I'm not entirely on board with. The thing that you need to know about the new Globalize is that nothing comes for free. It's been completely modularised and you have to include extra libraries depending on the functionality you require. On top of that you then have to work out the portions of the cldr data that you require for those modules and supply them. This means that getting up and running with Globalize 1.x is much harder. Frankly I think it's a little painful.

I realise this is a little "Who moved my cheese". I'll get over it. I do actually see the logic of this. It is certainly good that the culture data is not frozen in aspic but will evolve as the world does. But it's undeniable that in our brave new world Globalize is no longer a doddle to pick up. At least, not right now.

Take the modules and run

So. What do we actually need? Well I've consulted the documentation and I think I'm clear. Our simple demo cares about dates and numbers. So I'm going to guess that means I need:

  • globalize.js (the core)
  • globalize/number.js (the number module)
  • globalize/date.js (the date module)

On top of that I'm also going to need the various cldr dependencies too. That's not all. Given that I've decided which modules I will use I now need to acquire the associated cldr data. According to the docs here we're going to need:

  • cldr/supplemental/likelySubtags.json
  • cldr/main/locale/ca-gregorian.json
  • cldr/main/locale/timeZoneNames.json
  • cldr/supplemental/timeData.json
  • cldr/supplemental/weekData.json
  • cldr/main/locale/numbers.json
  • cldr/supplemental/numberingSystems.json

Figuring that all out felt like really hard work. But I think that now we're ready to do the actual migration.

Update 30/08/2015: Globalize · So What'cha Want

To make working out what you need when using Globalize easier, I've built Globalize · So What'cha Want. You're so very welcome.

The Actual Migration

To do this I'm going to lean heavily upon an example put together by Rafael. The migrated code looks like this:

<html>
  <head>
    <title>Globalize demo</title>
    <link href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.5/css/bootstrap.min.css" rel="stylesheet">
  </head>
  <body>
    <div class="container-fluid">
      <h4>Globalize demo for the <em id="locale"></em> culture / locale</h4>
      <p>This is the number <strong id="number"></strong> formatted by Globalize: 
          <strong id="numberFormatted"></strong></p>
      <p>This is the date <strong id="date"></strong> formatted by Globalize: 
          <strong id="dateFormatted"></strong></p>
    </div>

    <!-- First, we load Globalize's dependencies (`cldrjs` and its supplemental module). -->
    <script src="bower_components/cldrjs/dist/cldr.js"></script>
    <script src="bower_components/cldrjs/dist/cldr/event.js"></script>
    <script src="bower_components/cldrjs/dist/cldr/supplemental.js"></script>

    <!-- Next, we load Globalize and its modules. -->
    <script src="bower_components/globalize/dist/globalize.js"></script>
    <script src="bower_components/globalize/dist/globalize/number.js"></script>

    <!-- Load after globalize/number.js -->
    <script src="bower_components/globalize/dist/globalize/date.js"></script>

    <script>

        var locale = 'de';

        Promise.all([
          // Core
          fetch('bower_components/cldr-data/supplemental/likelySubtags.json'),

          // Date
          fetch('bower_components/cldr-data/main/' + locale + '/ca-gregorian.json'),
          fetch('bower_components/cldr-data/main/' + locale + '/timeZoneNames.json'),
          fetch('bower_components/cldr-data/supplemental/timeData.json'),
          fetch('bower_components/cldr-data/supplemental/weekData.json'),

          // Number
          fetch('bower_components/cldr-data/main/' + locale + '/numbers.json'),
          fetch('bower_components/cldr-data/supplemental/numberingSystems.json'),
        ])
        .then(function(responses) {
            return Promise.all(responses.map(function(response) {
                return response.json();
            }));
        })
        .then(Globalize.load)
        .then(function() {
            var number = 12345.67;
            var date = new Date(2012, 5, 15);

            var globalize = Globalize( locale );
            document.querySelector('#locale').innerText = locale;
            document.querySelector('#number').innerText = number;
            document.querySelector('#date').innerText = date;
            document.querySelector('#numberFormatted').innerText = globalize.formatNumber(number, {
              minimumFractionDigits: 2, 
              useGrouping: true 
            });
            document.querySelector('#dateFormatted').innerText = globalize.formatDate(date, { 
              date: 'medium' 
            });
        })
    </script>
  </body>
</html>

By the way, I'm using fetch and promises to load the cldr-data. This isn't mandatory - I use it because Chrome lets me. (I'm so bleeding edge.) Some standard jQuery ajax calls would do just as well. There's an example of that approach here.

Observations

We've gone from not a lot of code to... well, more than a little. A medium amount. Almost all of that extra code relates to getting Globalize 1.x spun up so it's ready to work. We've also gone from 2 HTTP requests to 13, which is unlucky for you; 6 of them took place via ajax after the page had loaded. It's worth noting that we're not even loading all of Globalize either. On top of that there's the old order-of-loading shenanigans to deal with. All of this can be mitigated by introducing a custom build step of your own to concatenate and minify the associated cldr / Globalize files.

Loading the data via ajax isn't mandatory by the way. If you wanted to you could create your own style of globalize.culture.de.js files which would allow you to load the page without recourse to post-page-load HTTP requests. Something like this Gulp task I've knocked up would do the trick:


gulp.task("make-globalize-culture-de-js", function() {
  var locale = 'de';
  var jsonWeNeed = [
    require('./bower_components/cldr-data/supplemental/likelySubtags.json'),
    require('./bower_components/cldr-data/main/' + locale + '/ca-gregorian.json'),
    require('./bower_components/cldr-data/main/' + locale + '/timeZoneNames.json'),
    require('./bower_components/cldr-data/supplemental/timeData.json'),
    require('./bower_components/cldr-data/supplemental/weekData.json'),
    require('./bower_components/cldr-data/main/' + locale + '/numbers.json'),
    require('./bower_components/cldr-data/supplemental/numberingSystems.json')
  ];

  var jsonStringWithLoad = 'Globalize.load(' + 
    jsonWeNeed.map(function(json){ return JSON.stringify(json); }).join(', ') + 
    ');';

  var fs = require('fs');
  fs.writeFile('./globalize.culture.' + locale + '.js', jsonStringWithLoad, function(err) {
      if(err) {
          console.log(err);
      } else {
          console.log("The file was created!");
      }
  });
})

The above is standard node/io type code by the way; just take the contents of the function, run it in node and you should be fine. If you do use this approach then you're very much back to the simplicity of Globalize 0.1.x's approach.

And here is the page in all its post migration glory:

It looks exactly the same except 'de-DE' has become simply 'de' (since that's how the cldr rolls).

The migrated code is there for the taking. Make sure you remember to bower install - and you'll need to host the demo on a simple server since it makes ajax calls.

Before I finish off I wanted to say "well done!" to all the people who have worked on Globalize. It's an important project and I do apologise for my being a little critical of it here. I should say that I think it's just the getting started that's hard. Once you get over that hurdle you'll be fine. Hopefully this post will help people do just that. Pip, pip!