
Create React App with ts-loader and CRACO

Create React App is a fantastic way to get up and running building a web app with React. It also supports using TypeScript with React. Simply entering the following:

npx create-react-app my-app --template typescript

Will give you a great TypeScript React project to get building with. There are two parts to the TypeScript support:

  1. Transpilation AKA "turning our TypeScript into JavaScript". Since Babel 7 launched, Babel has enjoyed great support for transpiling TypeScript into JavaScript. Create React App leverages this, using the Babel webpack loader, babel-loader, for transpilation.
  2. Type checking AKA "seeing if our code compiles". Create React App uses the fork-ts-checker-webpack-plugin to run the TypeScript type checker on a separate process and report any issues that may exist.

This is a great setup and works very well for the majority of use cases. However, what if we'd like to tweak this setup? What if we'd like to swap out babel-loader for ts-loader for compilation purposes? Can we do that?

Yes you can! And that's what we're going to do using a tool named CRACO - the pithy shortening of "Create React App Configuration Override". This is a tool that allows us to:

Get all the benefits of create-react-app and customization without using 'eject' by adding a single craco.config.js file at the root of your application and customize your eslint, babel, postcss configurations and many more.

babel-loader → ts-loader#

So let's do the swap. First of all we're going to need to add CRACO and ts-loader to our project:

npm install @craco/craco ts-loader --save-dev

Then we'll swap over our various scripts in our package.json to use CRACO:

"start": "craco start",
"build": "craco build",
"test": "craco test",

Finally we'll add a craco.config.js file to the root of our project. This is where we swap out babel-loader for ts-loader:

const { addAfterLoader, removeLoaders, loaderByName, getLoaders, throwUnexpectedConfigError } = require('@craco/craco');

const throwError = (message) =>
  throwUnexpectedConfigError({
    packageName: 'craco',
    githubRepo: 'gsoft-inc/craco',
    message,
    githubIssueQuery: 'webpack',
  });

module.exports = {
  webpack: {
    configure: (webpackConfig, { paths }) => {
      const { hasFoundAny, matches } = getLoaders(webpackConfig, loaderByName('babel-loader'));
      if (!hasFoundAny) throwError('failed to find babel-loader');
      console.log('removing babel-loader');

      const { hasRemovedAny, removedCount } = removeLoaders(webpackConfig, loaderByName('babel-loader'));
      if (!hasRemovedAny) throwError('no babel-loader to remove');
      if (removedCount !== 2) throwError('had expected to remove 2 babel loader instances');

      console.log('adding ts-loader');
      const tsLoader = {
        test: /\.(js|mjs|jsx|ts|tsx)$/,
        include: paths.appSrc,
        loader: require.resolve('ts-loader'),
        options: { transpileOnly: true },
      };

      const { isAdded: tsLoaderIsAdded } = addAfterLoader(webpackConfig, loaderByName('url-loader'), tsLoader);
      if (!tsLoaderIsAdded) throwError('failed to add ts-loader');
      console.log('added ts-loader');

      console.log('adding non-application JS babel-loader back');
      const { isAdded: babelLoaderIsAdded } = addAfterLoader(
        webpackConfig,
        loaderByName('ts-loader'),
        matches[1].loader // babel-loader
      );
      if (!babelLoaderIsAdded) throwError('failed to add back babel-loader for non-application JS');
      console.log('added non-application JS babel-loader back');

      return webpackConfig;
    },
  },
};

So what's happening here? The script looks for babel-loader usages in the default Create React App config. There will be two: one for TypeScript / JavaScript application code (which we want to replace) and one for non-application JavaScript code. I'm actually not too clear what non-application JavaScript code there is or can be, but we'll leave it in place; it may be important.

You cannot remove a single loader using CRACO, so instead we'll remove both and add back the non-application JavaScript babel-loader. We'll also add ts-loader with the transpileOnly: true option set (to ensure ts-loader doesn't do type checking).

Now the next time we run npm start we'll have Create React App running using ts-loader and without having ejected. If we want to adjust the options of ts-loader further then we're completely at liberty to do so, adjusting the options in our craco.config.js.

If you value debugging your original source code rather than the transpiled JavaScript, remember to set the "sourceMap": true property in your tsconfig.json.
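
That's a small addition to the compilerOptions of your tsconfig.json; something like this (the rest of your generated tsconfig.json stays as it is):

{
  "compilerOptions": {
    // emit source maps so devtools can map the served JavaScript back to your TypeScript
    "sourceMap": true
  }
}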

Finally, if we wanted to go even further, we could remove the fork-ts-checker-webpack-plugin and move ts-loader to transpileOnly: false so that it performs type checking as well. However, it's generally better to stay with the setup this post outlines, for performance reasons.
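
If you did want to try that, a rough sketch of the extra steps might look like the following. (This assumes CRACO's removePlugins and pluginByName helpers and that the plugin's constructor is named ForkTsCheckerWebpackPlugin; do check against your own setup.)

const { removePlugins, pluginByName } = require('@craco/craco');

// inside the configure function above:
// drop the separate type checking plugin...
removePlugins(webpackConfig, pluginByName('ForkTsCheckerWebpackPlugin'));

// ...and, when defining tsLoader, have ts-loader type check as well as transpile:
// options: { transpileOnly: false },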

Definitely Typed: The Movie

I'd like to tell you a story. It's the tale of the ecosystem that grew up around a language: TypeScript. TypeScript is, for want of a better description, JavaScript after a trip to Savile Row. Essentially the same language, but a little more together, a little less wild west. JS with a decent haircut and a new suit. These days, the world seems to be written in TypeScript. And when you pause to consider just how young the language is, well, that's kind of amazing.

Who could have predicted it would end up like this? When I was a boy I remember coming down the stairs in my childhood home. Shuffling to the edge of each step on my bottom before thumping down to the one beneath. When I look at those same stairs now they're so small. I barely notice the difference between one step and the next. But back then each step seemed giant, each one so far apart. Definitely Typed had any number of steps in its evolution. They all seemed so significant then; whereas now they're just a memory. Let's remember together…

Prolog(ue)#

When it was first unveiled to the world by Anders Hejlsberg back in 2012, there was nothing to suggest TypeScript was going to be seismic in its effects. The language brought two important things to the table. First of all, the ability to write JavaScript with optional static typing (imagine this as "belts and braces" for JS). The second feature was interoperability with existing JavaScript.

The reason TypeScript has the traction that it does is a consequence of the latter feature. The JavaScript ecosystem was already a roaring success by 2012. Many useful libraries were out there, authored in vanilla JavaScript. jQuery, Backbone and Knockout were all going concerns. People were building things.

Wisely, having TypeScript able to work with existing JavaScript libraries was a goal of the language right from the off. This made sense; otherwise it would have been like unveiling Netflix to the world whilst saying "sorry you can't use a television set to watch this". Remember, JS was great as is - people wanted static typing so they could be more productive and so they could sleep better at night. ("Oh wait, did I write that unit test to check all the properties? Dammit, it's 3am!") If TypeScript had hove onto the scene requiring that everything was written in TypeScript then I would not be writing this. It didn't.

Interoperability was made possible by the concept of "type definitions". Analogous to header files in C, these are TypeScript files with a .d.ts suffix that tell the compiler about an existing JavaScript library which is in scope. This means you can write TypeScript and use jQuery or [insert your favourite library name here]. Even though they are not written in TypeScript.
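
To give a flavour, a type definition for a small JavaScript library might look like this. (This is an invented example rather than any real library's definition.)

// my-library.d.ts
// Describes the API of a plain JavaScript library to the TypeScript compiler
declare function formatPrice(amount: number, currency: string): string;
declare const MY_LIBRARY_VERSION: string;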

At the time of the initial TypeScript announcement (v0.8.1) there was no concept of a repository of type definitions. I mean, there was every chance that TypeScript wasn't going to be a big deal. Success wasn't guaranteed. But it happened. You're reading this in a world where Definitely Typed is one of the most popular repos on GitHub and where type definitions from it are published out to npm for consumption by developers greedy for static types. A world where the TypeScript team has pretty much achieved its goal of "types on every desk".

I want to tell you the story of the history of type definitions in the TypeScript world. I'm pretty well placed to do this since I've been involved since the early days. Others involved have been kind enough to give me their time and tell me their stories. There are likely to be errors and omissions, and that's on me. It's an amazing tale though; I'm fortunate to get to tell it.

The First Type Definition#

I was hanging out for something like TypeScript. I'd been busily developing rich client applications in JS and, whilst I loved the language, I was dearly missing static typing. All the things broke all of the time and I wanted help. I wanted a compiler to take me by the hand and say "hey John, you just did a silly thing. Don't do it John; you'll only be filled with regret...". The TypeScript team wrote that compiler.

When TypeScript was announced, it was important that the world could see that interop with JS was a first class citizen. Accordingly, a jQuery type definition was demonstrated as well. At the time, jQuery was the number one JavaScript library downloaded on the internet. So naturally it was the obvious choice for a demo. The type definition was fairly rough and ready but it worked. You can see Anders Hejlsberg showing off the jQuery definition 45 minutes into this presentation introducing TypeScript.

Consumption was straightforward, if perhaps quirky. You took the jquery.d.ts file, copied it into your project location. Back then, to let the compiler know that a JS library had come to the party you had to use a kind of comment pragma in the header of your TypeScript files. For example: /// <reference path="jquery/jquery.d.ts" />. This let TypeScript know that the type definition living at that path was relevant for the current script and it should scope it in.
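
So a consuming TypeScript file of that era might have started out something like this (illustrative only):

/// <reference path="jquery/jquery.d.ts" />

// the compiler now knows what $ is and can check this call
$('#message').text('Hello from TypeScript');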

There was no discussion of “how do we type the world?” Even if they wanted to, the TypeScript team didn't really have the resources at that point to support this. They'd got as far as they had on the person power of four or five developers, plus some testers. There was a problem clearly waiting to be solved. As luck would have it, in Bulgaria a man named Boris Yankov had been watching the TypeScript announcement.

Boris Yankov#

Boris Yankov was a handsome thirty-year-old man, living in the historic Bulgarian city of Plovdiv. He was swarthy with dark hair; like Ben Affleck if he had been hanging out in Eastern Europe for a couple of years.

Boris was a backend developer who'd found himself doing more and more frontend. More JavaScript. He was accustomed to C# on the backend with static typing a-gogo. From his point of view JS was brittle. It was super easy to break things and have no idea until runtime that you'd done so. It seemed so backward. He was ready for something TypeScript shaped.

"What people forget is how different it was back then. Microsoft made this announcement, but probably most of the people that were listening were part of the MS ecosystem. I certainly was. Remember, back then if you had a Mac or did Linux you probably didn't think about MS too much."

Boris thought TypeScript just seemed like this interesting and weird thing that Microsoft were doing. He was excited by types; he was missing them and there was a real need there. A problem to solve. There were already people trying to address this. But the attempts so far had been underwhelming. Boris had encountered Google Closure Compiler, a tool built by Google which, amongst other things, introduces some measure of type safety to JavaScript by reading annotations in JSDoc format. Boris viewed GCC as a tentative first step. One which led the way for things like TypeScript and Flow to follow.

The other aspect of TypeScript that excited Boris was transpilation. Transpilation is the term coined to describe what TypeScript does when it comes to emit output. It takes in TypeScript and pumps out JavaScript. The question is: what sort of JavaScript? One choice the TypeScript team could have made was just to have the compiler strip out types from the codebase. If it worked that way then you'd get out the JavaScript equivalent of the TypeScript you wrote. You wrote a class? TypeScript emits a class; just one shorn of types and interfaces.

The TypeScript team made a different choice. They wrote the compiler such that the user could write ES6 style TypeScript syntax and have the TypeScript compiler transpile that down to ES5 or even ES3 syntax. This made TypeScript a much more interesting proposition than it already was, for a couple of reasons.
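
To make that concrete: given ES6-flavoured TypeScript like the snippet below, the compiler could emit ES5 along the lines of the second snippet. (The output shown is indicative rather than the compiler's exact emit.)

// ES6-style TypeScript in...
const double = (numbers: number[]) => numbers.map((n) => n * 2);

// ...roughly this ES5 JavaScript out
var double = function (numbers) {
  return numbers.map(function (n) { return n * 2; });
};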

ES6 had been in the works for some time at this point. The release was shaping up to be the biggest incremental change to JavaScript that had so far happened. Or that would ever happen. Prior to this, JavaScript had experienced no small amount of tension and disagreement as it sought to evolve and develop. These played out in the form of the abandoned fourth edition of the language. There were arguments, harsh words, public disagreements and finally a failure to ship ECMAScript 4. In an alternate universe this was the end of the road for JavaScript. However, in our universe JavaScript got another throw of the dice.

It's telling that ES5 was for a long time also known as ES3.1, reflecting that it was initially planned to be the stepping stone between ES3 and ES4. In reality it ended up being the stepping stone between ES3 and ES6. As it turned out, it was a vital one too: it allowed TC39 to recalibrate after a very public shelving of plans.

The band was back together (albeit with a new rhythm section) and ES6 was going to be massive. JavaScript was going to get new constructs such as Map, Set, new scoping possibilities with let and const, Promises which paved the way for new kinds of async programming, the contentious classes…. And who can forget where they were when they first heard about "fat" arrow functions?

People salivated at the idea of it all. Such new shiny toys! But how could we use them? Whilst all this new hotness was on the way, where could you actually run your new style code? Complete browser implementations of ES6 wouldn't start to materialise until 2018. Given the slowness of people to upgrade and the need to support the lowest common denominator of browser this could have meant that all the excitement was trapped in a never tomorrow situation.

Back to TypeScript. The team had a solution for this issue. In their wisdom, the TypeScript team allowed us to write ES6 TypeScript and the compiler could (with some limitations) transpile it down to ES3 JavaScript. The audacity of this was immense. The TypeScript team brought the future back to the past. What's more, they made it work in Internet Explorer 6. Now that's rock'n'roll. It's nothing short of miraculous!

The significance of transpilation to TypeScript cannot be overstated.

You might be thinking to yourself, "that's just Babel, right?" Right. It's just that Babel didn't exist then. 6to5 was still an idea waiting for Sebastian McKenzie to think of. Even if you were kind of "meh" on types, the attraction of using a tool which allowed you to use new JavaScript constructs without breaking your customers was a significant draw. People may have come for types, but once they'd experienced the joy of a lexically bound this in a fat arrow function they were never going back.

Success has many parents. TypeScript is a successful project. One reason for this is that it's an excellent product that fills a definite need. Another reason is one that can't be banked upon; timing. TypeScript has enjoyed phenomenal timing. Appearing just when JavaScript was going off like a rocket and having the twin benefits of types and future JS today when nothing else offered anything close, that's perfect timing. It got people's curiosity. Now it got Boris's attention.

Definitely Typed#

Boris had been feeling unproductive. He would build applications in JS and watch them unaccountably break as he made simple tweaks to them. He was constantly changing things, breaking them, fixing them and hoping he hadn't broken something else along the way. It was exhausting. He saw the promise in what TypeScript was offering and decided to give it a go.

It was great. He fired up Visual Studio and converted a .js file to end with the mystical TypeScript suffix of .ts. In front of his eyes, red squiggly lines started to appear here and there in his code. As he looked at the visual noise he could see this was TypeScript delivering on its promise. It was finding the bugs he hadn't spotted. These migrations were also addictive; the more information you could feed the compiler, the more problems it found. Boris felt it was time to start writing type definitions, whatever they were.

Boris quickly learned how to write a type definition and set to work. Most libraries weren't well documented and so he found himself reading the source code of libraries he used in order that he could write the definitions. At first, the definitions were just files dropped in his ASP.NET MVC projects that he copied around. That wasn't going to scale; there needed to be somewhere he could go to grab type definitions when he needed them. And so on October 5th 2012 he created a repository under his profile at GitHub called "DefinitelyTyped": https://github.com/DefinitelyTyped/DefinitelyTyped/commit/647369a322be470d84f8d226e297267a7d1a0796

Boris took his type definitions and put them into this repository. Were you ever curious what the first definition added was? Close your eyes and think... You might imagine it was the (then number one JavaScript library on the web) jQuery. In fact it was Modernizr. Then Underscore followed, and then jQuery. Take a look:

https://github.com/DefinitelyTyped/DefinitelyTyped/commits?after=4a4cf23ff4301835a45bb138bbb62bf5f0759255+699&author=borisyankov

It wasn't complicated; it was just a folder with subfolders underneath; each folder representing a project. One for jQuery, one for jQuery UI, one for Knockout.... You get the idea. It's not so different now.

Boris had laid simple but dependable foundations. Definitely Typed had been born.

How Do You Test a Type Definition?#

Boris was careful too. Right from the first type definition, he added tests alongside it. Now tests for a type definition were a conundrum. How do you write a test for interfaces that don't exist in the runtime environment? Code that is expunged as part of the compilation process. Well, the answer Boris came to was this: a compilation test.

Someone once said: compilation is the first unit test... But it's a doozy. They're right. The value you get from compilation, from a computer checking the assertions your code makes, is significant. Simply put, it takes a large number of tests to get the same level of developer confidence. Computers are wonderful at attention to detail in a way that puts even the most anally retentive human being to shame.

So if Boris had written a definition called mylib.d.ts, he'd write a file that exercises this type definition. A mylib.tests.ts if you will. This file would contain code that exercises the type definition in the way that it should correctly be used. This is code that will never be executed in the way that tests normally are; a test program like this is never actually run. Rather these tests exist solely for compilation time. (In much the same way that TypeScript types only exist for compilation time.) Boris's plan was this: no compilation errors in mylib.tests.ts represents passing tests. Compilation errors in mylib.tests.ts represents failing tests. It was functional, brutal and also beautiful in its simplicity.

So, imagine your definition looked like this:

declare function turnANumberIntoAString(numberToMakeStringOutOf: number): string;

You might write a compilation test that looks like this:

const itIsAString: string = turnANumberIntoAString(42);

This test ensures that you can use your function in the way you'd expect. It returns the type you'd desire (a string in this case) and it accepts the parameters you'd expect (a single number for this example). If someone changed the definition in future, such that a different type was returned or a different set of parameters was required, it would break the test. The test code wouldn't compile anymore. That's the nature of our "test". It's blunt but effective.

This is the very first test committed to Definitely Typed; a test for Modernizr.

https://github.com/DefinitelyTyped/DefinitelyTyped/blob/a976315cacbcfc151d5f57b25f4325dc7deca6f2/Tests/modernizr.ts

This idea represents what tests look like throughout Definitely Typed today. They're now "run" as part of Continuous Integration and type definitions are tested in concert with one another to ensure that a change to one type definition doesn't break another. But there is nothing fundamentally different in place today from what Boris originally came up with.

Independence#

Very quickly, Definitely Typed became a known project. People like Steve Fenton (author of the first book about TypeScript) were vocal supporters of the project. The TypeScript team talked up the project and were entirely supportive of its existence. In fact, at every given opportunity Anders Hejlsberg would sing its praises. For a while you could guarantee that any TypeScript talk by Anders would include a variant of "this guy called Boris started a project called Definitely Typed". The impression he gave was that he was kind of amazed, and thoroughly delighted, the project existed.

The TypeScript team were completely uninvolved with Definitely Typed. That in itself is worth considering. The perception of Microsoft by developers generally in 2012 was, at best, highly suspicious. "Embrace, extend, extinguish" - a strategy attributed to MS - was very much a current perspective. This was borne out in online comments and conversations at meetups. The Hacker News comments on the TypeScript release were a mixed bag. The reaction on social media was rather less generous. Certainly it was harsh enough to prompt Scott Hanselman to write something of a defence of TypeScript's right to exist.

Given that TypeScript had arrived with the promise of transforming the JavaScript developer experience, the developer community was understandably cautious. Was Microsoft doing good or ill? Could they be trusted? There were already signs that MS was changing. For example, it had been shipping open source libraries such as jQuery with ASP.NET MVC for some time. Microsoft was starting to engage with the world of open source software.

How Microsoft interacted with the (very open source driven) JS community was going to be key to the success (or not) of TypeScript. What happened with the establishment of Definitely Typed very much indicated TypeScript's direction of travel.

On day one of its existence, Boris took type definitions written by Microsoft and made them available via Definitely Typed. A ballsy move. It would have been completely possible for MS to object to this. They didn't.

People like Diullei Gomes started submitting pull requests to improve the existing definitions and add new ones. Diullei even wrote the first command line tooling which allowed people to install type definitions: TSD. Within a surprisingly short period, DT had become the default home of type definitions on the web. There were briefly alternative Definitely Typed styled collections of type definitions elsewhere on GitHub but they didn't last.

This all happened completely independently of the TypeScript team. Definitely Typed existing actually allowed TypeScript itself to prosper. It was worth persevering with this bleeding edge language because of the interoperability Definitely Typed was providing to the community. So the hands off attitude of MS was both surprising and encouraging. It showed trust of the community; something that hadn't hitherto been a commonly noted characteristic of MS.

Boris started adding contributors to Definitely Typed to help him with the work. Definitely Typed was no longer a one-man band; it had taken an important step. It was built and maintained by an increasing number of creative and generous people. All motivated by a simple aim: the best developer experience when working with TypeScript and existing JS libraries.

Basarat Ali Syed#

Basarat Ali Syed was a 27-year-old who had recently moved to Melbourne, Australia from Pakistan. You might know of him for a number of reasons, not least being the TypeScript equivalent of Jon Skeet. That, incidentally, is not a coincidence. Basarat had watched Jon Skeet's impressive work, the gold standard in C# answers, and thought "there's something worth emulating here".

Bas was working for a startup that had a JS frontend. About six months before TypeScript was announced to the world he watched Anders Hejlsberg do a presentation on JavaScript which included Anders saying to the audience "don't you just wish you had type safety?" with a twinkle in his eye. TypeScript was of course well underway by this time; just not yet public. Bas remembered the comment and, when TypeScript was announced, he was ready. He made it his personal mission to be the go-to person answering questions about TypeScript on Stack Overflow.

In those early days of TypeScript, if you put a question about TypeScript onto Stack Overflow there was a very good chance that Bas would answer it. And Bas was more helpful than your typical SO answerer. Not only would he provide helpful commentary and useful guidance, he would often find himself answering "yeah, the problem isn't your code, it's the type definition. It needs improvement. In fact, I've raised a PR to fix it here…"

Boris saw the drip, drip of Basarat PRs turning into a flood. So, very quickly, he invited Basarat to join Definitely Typed. Now Bas could not just suggest changes; he could ensure they were made. Step by step the quality of type definitions improved.

Basarat describes himself as a "serial OSS contributor and mover on-er". It's certainly true. As well as his Stack Overflow work, he's been someone involved in the early days of any number of open source projects. Not just Definitely Typed. Bas also worked on the TypeScript port of the JavaScript task runner Grunt: Grunt TS. He met up with Pete Hunt (he of React) at a Decompress conference and together they hacked together a POC webpack TypeScript loader. (That POC ultimately led to James Brantly creating ts-loader, which I maintain.) Bas wrote the atom-typescript plugin which offers first class support for TypeScript in Atom. Not content with that he went on to write a full blown editor of his own called alm-tools.

This is not an exhaustive list of his achievements and already I'm tired. Besides this he wrote the TypeScript Deep Dive book and the VS Code TypeScript God extension. And more.

Bas had the level of self knowledge required to realise that getting others involved was key to the success of open source projects. Particularly given that he knew he had a predilection to eventually move on, to work on other things. So Bas kept his eyes open and welcomed in new maintainers for projects he was working on. Bas' actions in particular were to be crucial. Bas grew the Definitely Typed team; he invited others in, he got people involved.

On December 28th 2013 Basarat decided that a regular contributor to Definitely Typed might be a potential team member. Bas opened up Twitter and sent a Direct Message to John Reilly.

John Reilly#

That's me. Or johnny_reilly on Twitter and johnnyreilly on GitHub (as John Papa and I have learned to our chagrin, GitHub doesn't support the "_" character in usernames). Relatively few people call me Johnny. I'm named that online because back when I applied for an email address, someone had already bagsied john_[email protected] So rather than sully my handle with a number or a middle name I settled for johnny_reilly. I haven't looked back and have generally tried to keep that nom de plume wherever I lay my hat online.

In contrast to others I was a relatively late starter to TypeScript. I was intrigued right from the initial announcement, but held off from properly getting my hands dirty until generics were added to the language in 0.9. (This predisposition towards generics in a language perhaps explains why I didn't get too far with Golang.)

At that point I was working in London for a private equity house. It was based in the historic and affluent area of St James. St James is an interesting part of London, caught midway between the Government, Buckingham Palace and the heart of the West End. It's old fashioned, dripping with money and physically delightful. It's the sort of place film crews dash towards when they're called upon to show old fashioned London in all its pomp. It rocks.

My team hated JavaScript. Absolutely loathed it. I was the solo voice saying "but it's really cool!" whilst they all but burned effigies of Brendan Eich in each code review. However, to my delight (and their abject horror) the project we were working on could only be implemented using JS. Essentially the house wanted an application offering rich interactivity which had to be a web app. So… JS. We were coding then with a combination of jQuery and Knockout JS. And, in large part due to the majority of the team being unfamiliar with JS, we were shipping bugs. The kind of bugs that could be caught by a compiler. By static typing. Not to put too fine a point on it; by TypeScript.

So I proposed an experiment: "Let's take one screen and develop it with TypeScript. Let's leave the rest of the app as is; JavaScript as usual. And then once we're done with that screen let's see how we feel about it. TypeScript might not be that great. But that's fine, if it isn't we'll take the generated JS, keep that and throw away the TypeScript. Deal?"

The team were on board and, one sprint review later, we decided that all future JS functionality would be implemented with TypeScript. We were in!

From day one of using TypeScript I was in love. I had the functionality of JavaScript, the future semantics of JavaScript and I was making fewer mistakes. Our team had become more productive. We were shipping faster and more reliably with fewer errors. People were noticing; our reputation as a team was improving, in part due to our usage of TypeScript. We had a jetpack.

However. I wasn't satisfied. As I tapped away at my keyboard I found type definitions to be… imperfect. And that niggled. Did it ever niggle. By then Jason Jarrett had wired up Definitely Typed packages to be published out to NuGet. Devs using ASP.NET MVC 4 (as I then was) were busily installing type definitions alongside AutoFac and other dependencies. Most of those dependencies arrived like polished diamonds: finished products ready to be plugged into the project and start adding value. The type definitions, by contrast, felt very beta. And of course, they were. TypeScript was beta. The definitions reflected the newness of the language.

I could make it better.

I started submitting pull requests. The first problem I decided to solve was IntelliSense. I wanted IntelliSense for jQuery. If you went to https://api.jquery.com there was rich documentation for every method jQuery exposed. I wanted to see that documentation inside Visual Studio as I coded. If I keyed in $.appendTo( I wanted VS to be filled with the content from https://api.jquery.com/appendTo/ . That was my mission. For each overload of the method I'd add something akin to this to the type definition file:

/**
 * Insert every element in the set of matched elements to the end of the target.
 *
 * @param target A selector, element, HTML string, array of elements, or jQuery
 *               object; the matched set of elements will be inserted at the end
 *               of the element(s) specified by this parameter.
 */
appendTo(target: string): JQuery;

It was a tedious task plugging it all in, but the pleasure I got from having rich IntelliSense in VS more than made up for it. Along the way I added and fixed sections of the jQuery API that hadn't been implemented, or had been implemented incorrectly. It got to a point where jQuery was a good example of what a type definition should look like. That remains the case to this day; surprisingly few type definitions enjoy the JSDoc richness of jQuery. I have tried to encourage more use of this with blog posts, code reviews and the like, but it's never got the traction I'd hoped.

I'm fairly relentless when I put my mind to something. I work very hard to make things come to pass. What this meant at one point was the Definitely Typed maintainers receiving multiple PRs a day. Which prompted Bas to wonder "I wonder if he'd like to join us?"

I happily accepted Bas' invitation and soon found myself reading this email:

From: Bas

Sent: 28 December 2013 11:47

To: Boris Yankov; johnny_[email protected]; Bas; vvakame; Bart van der Schoor; Diullei Gomes; steve fenton; Jason Jarret

Subject: DefinitelyTyped team introduction

Dear All,

Meet John Reilly (github : https://github.com/johnnyreilly , twitter : https://twitter.com/johnny\_reilly) who will be helping with Definitely Typed definitions.

Boris manages the project and he can add you as a collaborator.

Additional team member introductions:

Admin : Boris Yankov

TSD package manager : https://github.com/DefinitelyTyped/tsd : Diullei / Bart van der Schoor

NUGET: https://github.com/DefinitelyTyped/NugetAutomation : Json Jarret

Passionate TypeScript users like yourself: Wakame, Myself and SteveFenton .

Cheers, Bas (Basarat)

Some of those names you'll recognise; some perhaps not. Jason Jarrett wrote the NuGet distribution mechanism for type definitions that ended up existing for far longer than anyone (least of all Jason) anticipated. Steve Fenton was largely a cheerleader for Definitely Typed in its early days. Diullei and Bart, amongst other things, worked on the initial command line tooling for DT: TSD.

After being powered up in Definitely Typed, my contributions only increased. Anything that I was using in my day to day work, I wanted to have an amazing TypeScript experience. I wanted the language to thrive and I was pretty sure I could help by trying to get users the best-in-class developer experience as they used JS libraries. I've always found good developer experience a strong motivation; the idea being, if someone loves their tools, they'll do great work. The end customer (of whatever they're building) gets a better product sooner. Great developer experience is a force multiplier for building software.

Policy time#

TypeScript was now at version 0.9.1. Still very much beta. Back then every release was breaking. Breaking. Very much with a capital "B".

TypeScript had, since the very early days, made a commitment to track the ECMAScript standard. All JavaScript is valid TypeScript. However, there was briefly a period where this might not have been so. One of the things people most remember from the initial release is that they could now write classes. These were already the standardised classes of ES6 but it almost wasn't to be. For a brief period there had been consideration of doing something subtly different. In fact Anders would describe the TypeScript team's journey towards embracing the standards as a tale tinged with regret. In doing so they'd had to say goodbye to a different implementation of classes which he'd preferred but which they'd ditched because it wasn't standard.

Alongside differences like this there were other delineations. Types had different names in the past which, as time went by, were renamed to align with standards. boolean was originally bool, for instance; likely a reflection of Anders' involvement with C#.

These sorts of changes, alongside any number of others, meant that each release of TypeScript sometimes entirely broke the definitions in Definitely Typed. Most notable was the 0.9.1 -> 0.9.5 migration. This was both an exercise in serious pain endurance and also a testament to the already strong commitment to TypeScript that existed. The reason people were willing to put the effort in to keep these migrations going was because they believed it was worth it. They believed in TypeScript. This PR is testament to that: https://github.com/DefinitelyTyped/DefinitelyTyped/pull/1385

A level of flux meant that for a long time Definitely Typed committed only to support the latest version of TypeScript and the latest version of packages therein. These days it's not so brutal, but then it had to be as a matter of necessity.

The compiler was changing too fast and there were too few people involved to allow for any realistic alternative. As is often the case in software development, it was "good enough". Any other choice would probably have increased the workload of maintainers to a point where the project would no longer be a going concern. It was a choice with downsides; trade-offs. But it was the choice that best served the future of Definitely Typed and TypeScript.

Masahiro Wakame#

Time passed. Autumn turned into winter, winter into spring. TypeScript reached 1.0. It wasn't beta anymore. As each release came, the changes in the compiler became more gradual. This was a blessing for the Definitely Typed team. The project's popularity was ticking up and up. New definitions were added each day. The trickle of issues and PRs had become a stream, then a river. A river very much ready to burst its banks.

It was taking its toll. Inside Definitely Typed roles were shifting. Boris was starting to step back from day to day reviewing of PRs. New members were joining the project, like Igor Oleinikov. But the pace was insatiable.

Some people left the project entirely, burned out by the never ending issues and PRs. Basarat started contributing less, beginning to turn his attention to one of his many sidejams. Fortunately, it turned out that before Basarat stepped back, he had done a very fine thing. In Tokyo, Japan, there was a 28-year-old developer named Masahiro Wakame.

Mas was using JS to build the web applications he worked on. But ECMAScript 5 wasn't hitting the mark for him. For a time Masahiro used CoffeeScript (Jeremy Ashkenas' Ruby-style JS alternative). He liked it, but, as he put it: "I was shooting my foot everyday". Looking out for that elusive solution he landed on Dart. It looked amazing. But it wasn't ECMAScript. Masahiro worried he'd be locked in. He'd built some libraries and a testing framework using Dart. But he didn't feel he could suggest that his company adopted it; it was too different and only he knew it. He was left with the "what if I go under a bus?" problem. If he left the company, his colleagues would find it hard to move away from using Dart. This made him very hesitant. He didn't feel he could justify the choice.

Then Masahiro heard about TypeScript. Like Goldilocks and the three bears, this third language sounded just right. He loved the type safety. It also had a compelling proposition: the transpiled JS that TypeScript generated was human readable and idiomatic. Generating idiomatic JS as opposed to some kind of strange byte code was a goal of the language from the early days, as Anders Hejlsberg would repeatedly explain. This generation of "real JS" made test driving TypeScript a low risk proposition. One that appealed to the likes of Masahiro. No lock-in. You decide TypeScript isn't for you? Fine. Take the generated JS files and shake the TypeScript dust off your sandals. Masahiro consequently went all in on TypeScript. This was his bet. And he was going to cover his bet by trying to make the ecosystem even stronger.

Masahiro started out trying to improve the testing framework in DT; sending in pull requests. Before too long, Basarat messaged him to say "do you wanna become a committer?" Masahiro became a committer.

It turned out that Masahiro had a special quality that DT was going to sorely need: he was willing and able to review PR after PR, day after day. His stamina was incredible.

Whilst it may not have been obvious from the outside, by now Definitely Typed was a slightly troubled project. The speed at which issues and PRs landed was relentless. Anyone who had once set GitHub to "watching" for Definitely Typed soon unsubscribed. It was becoming unmanageable. And whilst almost everyone else in the project was in the process of burning out / moving on / stepping back and similar, Masahiro kept going. He kept showing up. He kept reviewing. He kept merging. At his peak he was spending 2 hours a day, every day, glued to his screen in Tokyo reviewing PRs on GitHub. The pulse of Definitely Typed may have slowed. But Masahiro kept the heart beating.

As Masahiro kept the lights on, in a hotel room in Buenos Aires an Australian named Blake Embrey was making plans...

Blake Embrey#

Blake was a 21 year old Australian. He was a nomadic developer, travelling around the world and working remotely. He travelled from country to country armed with a suitcase and his trusty MacBook in search of WiFi. He found himself dialing into standups from cafés in Vietnam at 1am to provide Jira updates, coding from airports as he criss-crossed the globe. It was an unusual life.

A friend showed Blake TypeScript somewhere around the TypeScript 1.2 era. He was interested. He was mostly working on backend NodeJS at the time. He could see the potential that TypeScript had to help him. Around the TypeScript 1.5 era Blake started to take a really good look at what was possible. From his vantage point, there was good and there was also bad. And he thought he could help.

As a module author and developer, he loved TypeScript. It allowed him to write, publish and consume 100% type-safe JavaScript. Features like autocompletion, type validation and ES6 features became part of his typical workflow. It was so good!

However, one step in this development lifecycle was broken to his mind. The problem was module shaped. Yeah, modules. Wade into the controversy!

Thanks again to Basarat, Blake soon became a DT contributor. Of all the contributors to Definitely Typed, Blake was the first one who was looking hard at the module problem. This was because whilst he wanted TypeScript to solve the same problems as everyone else, he wanted to solve them in a world of package dependencies. He wanted to solve for Node.

Now it's worth taking a moment to draw a comparison between web development then, and web development now. Because it's changed. The phrase "web development fatigue" exists with good reason. Web development in 2014 as compared to web development in 2019 is a very different proposition. Historically, JavaScript has not had a good story around modularisation. The language meandered forwards without ever gaining an official approach to modularisation until ES6. So for twenty years, if you wanted to write a large JS application you had to think hard about how to solve this problem. And even when modules were nailed down it was longer still until module loading was standardised.

But that didn't stop us. JavaScript apps were still being built on the frontend and the backend. On the frontend an approach to modularisation emerged called the Asynchronous Module Definition. It had some adoption but in the main that wasn't how people rolled. The frontend was generally a sea of global variables. People would write programs that depended upon global variables and worked hard to make sure that they didn’t collide. Everything did this. Everything. Underscore? It was a global variable called _. jQuery? It was two global variables: $ and jQuery. That's just what people did. I'm a person. I did that. If you were there you probably did too.

On the server side, in Node JS land, a different standard had emerged: CommonJS. Unlike AMD, CommonJS was simply how the Node JS community worked. Everything was a CommonJS module. Alongside Node, npm was growing and growing. Exposing Node developers to a rich ecosystem of modules or packages that they could drop into their apps with merely a tap tap tap of npm install super-cool-package and then var scp = require('super-cool-package').

And therein, as the Bard would have it, lay the rub. You see, in the frontend it was simpler. Uglier but simpler. By and large, the global variables were fine. They weren't beautiful but they were functional. It may have impaired the development of frontend apps, but it certainly didn't stop it.

And since a design goal of TypeScript was to meet JavaScript developers where they were and try and make their lives better, the initial focus of Definitely Typed was necessarily types that existed in the global namespace. So jquery.d.ts would declare global $ and jQuery variables and underneath them all the jQuery methods and variables that were implemented. Alongside jQuery, maybe an application would have jQuery UI which would extend the $ variable and add extra functionality. In addition maybe there'd be a couple of jQuery plugins in play too. (It's worth saying that jQuery was the crack cocaine of web development back in the day. People just couldn't get enough.)

TypeScript catered for this world by allowing type definitions to extend interfaces created by other definition files. The focus of most of the Definitely Typed contributors up to this point was frontend and hence DT was an ocean of global type definitions.
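
As a simplified sketch of how that worked (not the real jquery.d.ts, which is far larger): the core definition declares the globals and an interface, and a plugin's definition reopens the same interface to bolt on its own methods.

// jquery.d.ts (simplified)
interface JQuery {
  text(value: string): JQuery;
}
declare var $: (selector: string) => JQuery;
declare var jQuery: (selector: string) => JQuery;

// jquery.myplugin.d.ts - a plugin's definition merges into the same global interface
interface JQuery {
  myPlugin(options?: { speed: number }): JQuery;
}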

Of course, this is not what the frontend world looks like these days. The frontend now is all about npm thanks to tools like Browserify, webpack, Rollup and the like. Client and server side development is mighty similar these days. Or at least, it's swimming in more of the same waters. There's a good TypeScript story to tell about this as well. But there wasn't always. Back to Blake.

Blake had published a bunch of modules on npm. But no one had ever been able to consume the type definitions from them. Why was that? Well, without delving into great detail it comes down to type definitions of a package generally conflicting with type definitions that a user installs themselves.

This essentially came down to how TSD worked and what Definitely Typed contained. TSD was a pretty simple tool; by and large it worked by copying files from Definitely Typed into a user's project. The files copied would contain type definitions which contained global types. So even though you cared solely about external modules, because of Definitely Typed you found yourself installing globals alongside, which led to conflicts between different type definitions. Different type definitions punching it out whilst the TypeScript compiler stood in between ineffectually shouting "leave it alone mate - it's not worth it!"

How could we have a world where external modules and globals were treated distinctly? Blake had ideas… Plan one was to rewrite TSD to support external modules; the type of modules that were standard in Node land. After working hard on that for some time, Blake came to the conclusion that solving global variables alongside external modules was a hard problem. A very hard problem. And perhaps that just running with external modules, a new start if you will, represented the best way forwards.

Typings#

Blake made Typings. Typings was a number of things: a new command line tool to replace TSD, a new approach to distributing type definitions and a registry. But Typings was a registry which pointed out to the web. Typings installation was entirely decentralised and the typings themselves could be downloaded from almost anywhere - GitHub, npm, Bower and even over HTTP or the filesystem. Those type definitions could be external modules or globals.

It was radical. From centralisation to decentralisation. As Blake described it:

This decentralization solves the biggest pain point I see with maintaining DefinitelyTyped. How does an author of one typings package maintain their file in DefinitelyTyped when they get notifications on thousands of others? How do you make sure typings maintain quality when you have 1000s to review? The solution in typings is you don’t, the community does. If typings are incorrect, I can just write and install my own from wherever I want, something that TSD doesn’t really allow. There’s no merge or review process you need to wait for (300+ open pull requests!).

However, decentralization comes with the cost of discoverability. To solve this, a registry exists that maintains locations of where the best typing can currently be installed from, for any version. If there’s a newer typing, patches, or the old typing author has somehow disappeared, you can replace the entry with your own so people will be directed to your typings from now on.

The world started to use Typings as the default CLI for type definitions. typings.json files started appearing in people's repos. Typings allowed consumption of types both from the Typings registry and from Definitely Typed and so there was an easy on ramp for people to start using Typings.

Little by little, people started consuming type definitions that came from the typings registry rather than from Definitely Typed. Typings began to thrive whilst DT continued to choke. The community was beginning to diverge.

The TypeScript Team#

Over in Seattle, the TypeScript team was thinking hard about the type definition ecosystem. About Definitely Typed and Typings. And about tooling and distribution.

At this point, there wasn't a dedicated registry for type definitions. There was GitHub. By and large, all type definitions lived in GitHub. Since GitHub is a git-based source control provider, it was possible for it to be used as a makeshift registry. So that's exactly what Definitely Typed and Typings were doing; piggybacking on GitHub and MacGyvering "infrastructure". It worked.

There wasn't a great versioning story. Definitely Typed just didn't do versioning. The latest and greatest was supported. Nothing else. The Typings approach was more nuanced. It did have an approach for versioning. It supported it by dint of allowing a version number in the registry to point to a specific git hash in a repo. It was an elegant and smart approach. Blake Embrey was one sharp cat.

Innovative though it was, the decentralised Typings approach presented potential security risks, as it pointed out to the web, making auditing harder. Alongside this, the TypeScript team was pondering ways they could reduce friction for developers that wanted to use TypeScript.

By now, the JavaScript ecosystem had started to coalesce around npm as the registry du jour. Bower and jspm were starting to fade in popularity. NuGet (the .NET package manager) was no longer being encouraged as a place to house JS. npm was standard. TypeScript users found themselves using npm to install jQuery and reaching for tsd or Typings to install the associated type definitions. That's two commands. With two package managers. Each with subtly different syntax. And then perhaps you had to fiddle with the tsconfig.json to get the compiler looking in the right places. It worked. But it didn't feel… idiomatic. It didn't feel like TypeScript was meeting its users where they were.

The likes of Daniel Rosenwasser, Mohamed Hegazy and Ryan Cavanaugh found themselves pondering the problem. Alongside this, they were thinking more about what a first class module support experience in TypeScript would look like, motivated in part by the critical mass around npm, which was entirely module / package based.

That wasn’t the only thing on their minds; there was also the testing story. Definitely Typed had a straightforward testing story due to being a real mega-repo. Everything lived together and could be tested together. Thanks to the hard work of the Definitely Typed team this was already in place; every PR spun up Travis and tested all the type definitions individually and in concert with one another. Typings didn’t have this. What’s more, it would be hard to build. The decentralised nature of Typings meant that you’d need to build infrastructure to crawl the Typings registry, download the type definitions and then perform the tests. It was non-trivial and unlikely to be speedy.

There was one more factor in play. The TypeScript team were aware that for the longest time they'd been working on the language. But they'd become distant from one of the most significant aspects of how the language was used. They weren't well enough informed about the rough edges in the type definition space. They weren't feeling their users' pain. They needed to address this and really there was only one thing to do... It was time for the TypeScript team to start eating their own dogfood.

A Plan Emerges#

The TypeScript team reached out to Blake Embrey and started to talk about ways forward. They started collaborating over Slack.

The TypeScript team had also been in contact with the Definitely Typed team. They were, at this point, aware that Definitely Typed was being kept going mainly due to the hard graft of Masahiro Wakame. As Daniel observed “vvakame was a champ”.

At this point I have to stick my own hand up and confess to thinking that Definitely Typed was not long for this world. Steve Ognibene (another DT member) and others were all feeling similarly. It seemed inevitable.

The TypeScript team were about to change that. After talking, thinking, thinking and talking they put together a plan. It was going to change TypeScript and change Definitely Typed. It was also going to effectively end Typings.

It’s worth saying at this point that the TypeScript team didn’t enter into this lightly. They were hugely impressed by Typings. It was, to quote Daniel Rosenwasser, “an impressive piece of work”. It also had the most amazing command line experience. Everyone on the team felt that it was an incredible endeavour and had their proverbial hats off to Blake Embrey. But Definitely Typed had critical mass and, whilst it had known problems, they were problems that could be likely solved (or ameliorated) through automation. The Typings approach was very innovative, but it presented other issues which seemed harder to solve. The TypeScript team made a bet. They placed their money on Definitely Typed.

To remove friction in the type acquisition space they decided to change the compiler. It would now look out for a special scoped namespace on npm named @types. Type definitions from Definitely Typed would be published out to @types. They would land as type definition packages that matched the non @types package. This meant that TypeScript was now sharing the same infrastructure as the rest of the JS ecosystem: npm. And consequently, installation of a package like jQuery in a TypeScript workflow now looked like this: npm install jquery @types/jquery. One command, one tool, one registry.

They published their plans here: https://github.com/Microsoft/TypeScript/issues/9184

There was more. The TypeScript team had really enjoyed knowing that this open source project which ran completely independently of the TypeScript team existed. And whilst they were focused directly on the language that was reasonable. But with the changes that were being planned, TypeScript was about to start explicitly depending upon Definitely Typed. It had been unofficially true up until that point. But now it was different; the TypeScript team were going to automate publishing Definitely Typed packages to the special @types scope on npm, which the TypeScript compiler gave preference to. TypeScript and Definitely Typed were going from dating to being engaged.

It was time for the TypeScript team to get involved.

The team committed to doing weekly rotations of a TypeScript team member working on Definitely Typed. Reviewing PRs, merging them and, crucially, helping with automation and testing.

TypeScript was now part of Definitely Typed. Definitely Typed was part of TypeScript.

TypeScript 2.0 / Definitely Typed 2.0#

Blake was immensely disappointed. He'd put his heart and soul into Typings. It was a massive amount of work and he'd not only started a project, he'd started a community that he felt responsible for.

Although that work had arguably kickstarted the discussion of what the future of type acquisition in TypeScript should look like, Typings wouldn't be coming along for the ride. It was a burner rocket, carrying the good ship TypeScript into outer orbit, dropping back to Earth once its job was done.

Blake very much had in mind all the people that had contributed to Typings, and that all their work was going to be abandoned. He felt a sense of responsibility. It was both frustrating and heartbreaking.

When TypeScript 2.0 shipped, in the release announcement was the following statement: https://devblogs.microsoft.com/typescript/announcing-typescript-2-0/

We’d like to thank Blake Embrey for his work on Typings and helping us bring this solution forward.

Blake really appreciated the recognition. In years to come Blake would come to feel that the decisions made were the right ones; that they led to TypeScript's continued success and served the community well. But he has regrets. He says now: "I am disappointed we didn't get to integrate the two philosophies for managing types. It hurt Typings registry contributors without a story in place, I didn’t want to let down and alienate potential contributors of type definitions."

A young Australian man had helped change the direction of TypeScript. It was time for him to take a well earned rest.

In the meantime, the TypeScript team was starting to get stuck into the work of giving Definitely Typed a make-over.

At this point, Definitely Typed had more than 500 open pull requests. Most of which had been open for a very long time. The most urgent and pressing problem was getting that down. The TypeScript team committed to, in perpetuity, a weekly rotation where one team member would review PRs. This would, in future, mean that PRs were handled in a timely fashion and that the number of open PRs was generally kept beneath 100.

Alongside this, changes were being made to the TypeScript compiler. In large part these related to enabling automatic type acquisition through the @types scope. To make that work, the TypeScript team realised pretty quickly that many of the type definitions would not work as is. Ryan wrote up this report:

At this point in time there were around 1700 type definitions. Pretty much all of them required some massaging. Roughly speaking, with TS 2.0, the language was going to move from a name-based type acquisition approach to a file-based one. New features were added to TypeScript 2.0, such as the export as namespace syntax, to support a type definition being used both in modules (where there are imports / exports) and in script files (where there aren't).
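
To give a flavour of what that looks like, here's a minimal, hypothetical definition file sketch (the library name and function below are made up purely for illustration):

// nifty-lib.d.ts - a hypothetical, minimal UMD-style definition
// Usable from a module: import { sum } from "nifty-lib";
export function sum(a: number, b: number): number;

// Also usable from a plain script file, where the library shows up
// as a global called niftyLib (e.g. niftyLib.sum(1, 2)).
export as namespace niftyLib;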

Ryan Cavanaugh put together scripts that migrated 1200 of the type definitions to TypeScript 2.0 syntax. The remaining 500 were delicately transitioned by hand by diligent TypeScript team members. It was a task of utter drudgery that still sparks flickers of PTSD in those who were involved. It was like being in the digital equivalent of a Dickensian workhouse.

This was one of the reasons why going with the centralised approach of Definitely Typed instead of the decentralised one of Typings was necessary. Because the TypeScript team were involved in DT they could help make things happen. They could do the hard work. In a decentralised world that wouldn't be possible; everything would constantly be held up, waiting.

It took a long time to get the types 2.0 branch to a point where CI went green. All this time, merges were taking place between the master branch and the future one. It was hard, unglamorous work. As Ryan put it, "I partied hard when CI went green for the first time on types 2.0."

The first and most obvious addition was the automation of TypeScript definitions being published out to npm.

Next came a solution for the "notification flood" issue. It was no longer feasible for a user to have Definitely Typed set up as "watching" in GitHub. That way lay an unstoppable deluge of information about issues and pull requests. The result was that users were generally unaware of changes, issues and so on. People, as much as they wanted to be, were becoming disconnected from the type definitions in DT that they were interested in.

The solution for this problem was, as with so many problems, a bot. It would send notifications to the users who had historically worked on a type definition when someone sent a PR. This was hugely useful. It made it possible for people to become effective stewards of the type definitions they knew about. It meant people could effectively remain involved with DT; giving them targeted information. It was the solution to a communications problem.

When Ryan Cavanaugh looked back upon TypeScript's story, he had this to say: “Definitely Typed is the best thing that could exist from our perspective”.

He was speaking from the perspective of a TypeScript team member. He could as well be speaking for the developer world at large. Definitely Typed is an organic monster of open source goodness; bringing types to the world thanks to nearly 10,000 contributors, each of whom has donated at least an hour of their time for the greater good. Far more than that in many cases. It’s incredible. I’m glad I get to be part of it. I never would have guessed it would have turned out like this.

Start Me Up: ts-loader meet .tsbuildinfo

With TypeScript 3.4, a new behaviour landed and a magical new file type appeared: .tsbuildinfo

TypeScript 3.4 introduces a new flag called --incremental which tells TypeScript to save information about the project graph from the last compilation. The next time TypeScript is invoked with --incremental, it will use that information to detect the least costly way to type-check and emit changes to your project.

...

These .tsbuildinfo files can be safely deleted and don’t have any impact on our code at runtime - they’re purely used to make compilations faster.

This was all very exciting, but until the release of TypeScript 3.6 there were no APIs available to allow third party tools like ts-loader to hook into them. The wait is over! Because with TypeScript 3.6 the APIs landed: https://www.typescriptlang.org/docs/handbook/release-notes/typescript-3-6.html#apis-to-support---build-and---incremental

This was the handiwork of the very excellent @sheetalkamat of the TypeScript team - you can see her PR here: https://github.com/microsoft/TypeScript/pull/31432

What's more, Sheetal took the PR for a test drive using ts-loader, and her hard work has just shipped with [v6.2.0](https://github.com/TypeStrong/ts-loader/releases/tag/v6.2.0):

If you're a ts-loader user, and you're using TypeScript 3.6+, then you can get the benefit of this now - provided you make use of the experimentalWatchApi: true option (there's a sketch of the relevant webpack config after this list). With this set:

  1. ts-loader will both emit and consume the .tsbuildinfo artefact.

  2. This applies when a project has either the composite or incremental tsconfig.json option set to true.

  3. The net result of people using this should be faster cold starts in build time where a previous compilation has taken place.
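
Here's a minimal sketch of what that opt-in might look like in a webpack.config.js (the file matching and resolve settings here are illustrative assumptions; your own config will differ):

module.exports = {
  module: {
    rules: [
      {
        test: /\.tsx?$/,
        loader: 'ts-loader',
        options: {
          // opt in to the TypeScript watch / incremental APIs so that
          // the .tsbuildinfo artefact is emitted and consumed between builds
          experimentalWatchApi: true,
        },
      },
    ],
  },
  resolve: {
    extensions: ['.ts', '.tsx', '.js'],
  },
};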

ts-loader v7.0.0#

We would love for you to take this new functionality for a spin. Partly because we think it will make your life better. And partly because we're planning to make using the watch API the default behaviour of ts-loader when we come to ship v7.0.0.

If you can take this for a spin before we make that change we'd be so grateful. Thanks so much to Sheetal for persevering away on this feature. It's amazing work and so very appreciated.

Coming Soon: Definitely Typed

A long time ago (well, 2012) in a galaxy far, far away (okay; Plovdiv, Bulgaria)....

Definitely Typed began!

This is a project that set out to provide type definitions for every JavaScript library that lacked them. An ambitious goal. Have you ever wondered what the story that lay behind it was?

Perhaps you know that the project was started by a shadowy figure named "Boris Yankov". And maybe you know that the TypeScript team is now part of the Definitely Typed team. There's a lot more to tell.

This autumn, I'd like to tell you the story of how Definitely Typed came to be what it is. From an individual commit in a repo that Boris created in 2012 to the number 10 project by contributions on GitHub in 2018. I'm part of that story. Basarat Ali Syed is part of that story. Masahiro Wakame too. Blake Embrey. Steve Fenton. Igor Oleinikov. It's an amazing and unexpected tale. One that turns upon the actions of individuals. They changed your life and I'd love you to learn how.

So, coming soon to a blog post near you, is the story of Definitely Typed. It's very exciting! Stay tuned...

Symbiotic Definitely Typed

I did ponder calling this post "how to enable a good TypeScript developer experience for npm modules that aren't written in TypeScript"... Not exactly pithy though.

Definitely Typed is the resource which allows developers to use TypeScript with existing JavaScript libraries that ship without their own type definitions.

DT began as a way to enable interop between JS and TS. When DT started, everything on npm was JavaScript. Over time it has become more common for libraries (eg Mobx / Angular) to be written (or rewritten) in TypeScript. For publishing, they are compiled down to JavaScript, with type definitions generated from the TypeScript source and shipped alongside the compiled JavaScript. These libraries do not need to exist in Definitely Typed anymore.

Another pattern that has emerged over time is that of type definitions being removed from Definitely Typed to live and be maintained alongside the libraries they support. An example of this is MomentJS.

This week, I think for the first time, there emerged another approach. Kent C Dodds' react-testing-library had started out with the MomentJS approach of hosting type definitions alongside the JavaScript source code. Alex Krolick raised a PR which proposed removing the type definitions from the RTL repo in favor of having the community maintain them at DefinitelyTyped.

I'll directly quote Kent's explanation of the motivation for this:

We were getting a lot of drive-by contributions to the TypeScript typings and many pull requests would either sit without being reviewed by someone who knows TypeScript well enough, or be merged by a maintainer who just hoped the contributor knew what they were doing. This resulted in a poor experience for TypeScript users who could experience type definition churn and delays, and it became a burden on project maintainers as well (most of us don't know TypeScript very well). Moving the type definitions to DefinitelyTyped puts the maintenance in much more capable hands.

I have to admit I was hesitant about this idea at first. I like the idea that types ship with the package they support. It's a good developer experience; users install your package and it works with TypeScript straight out of the box. However Alex's PR addressed a real issue: what do you do when the authors of a package aren't interested / equipped / don't have the time to support TypeScript? Or don't want to deal with the noise of TypeScript-related PRs which aren't relevant to them? What then?

Alex was saying, let's not force it. Let the types and the library be maintained separately. This can be, and is, done well already; React is a case in point. The React team does not work on the type definitions for React, that's done (excellently) by a crew of dedicated React lovers in Definitely Typed.

It's a fair point. The thing that was sad about this move was that the developer experience was going to have more friction. Users would have to yarn add -D @testing-library/react and then subsequently yarn add -D @types/testing-library__react to get the types.

This two step process isn't the end of the world, but it does make it marginally harder for TypeScript users to get up and running. It reduces the developer joy. As a side note, this is made more unlovely by @testing-library/react being a scoped package. Types for a scoped package have a quirky convention for publishing. A fictional scoped package of @foo/bar would be published to npm as: @types/foo__bar. This is functional but non-obvious; it's tricky to discover. A two step process instead of a one step process is a non-useful friction that it would be great to eliminate.

Fortunately, Kent and Daniel K had one of these moments:

Kent suggested that at the same time as dropping the type definitions that were shipped with the library, we try making @types/testing-library__react a dependency of @testing-library/react. This would mean that people installing @testing-library/react would get @types/testing-library__react installed automatically. So from the developer's point of view, it's as though the type definitions shipped with the package directly.
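
In package.json terms (the package.json of @testing-library/react itself), the change amounts to roughly this sketch; the version range shown is the one discussed later in this post:

{
  "dependencies": {
    "@types/testing-library__react": "^9.1.0"
  }
}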

To cut a long story short reader, that's what happened. If you're using @testing-library/react from 9.1.2 onwards, you're getting Definitely Typed under the covers. This was nicely illustrated by Kent showing what the TypeScript consumption experience looked like before the Definitely Typed switch:

And here's what it looked like after:

Identical! i.e. it worked. I grant you this is one of the more boring before / after comparisons there is… But hopefully you can see it demonstrates that this is giving us exactly what we need.

To quote Kent once more:

By adding the type definitions to the dependencies of React Testing Library, the experience for users is completely unchanged. So it's a huge improvement for the maintenance of the type definitions without any breaking changes for the users of those definitions.

This is clearly an approach that's useful; it adds value. It would be tremendous to see other libraries that aren't written in TypeScript but would like to enable a good TypeScript experience for those people that do use TS also adopting this approach.

Update: Use a Loose Version Range in package.json#

When I tweeted this article it prompted this helpful response from Andrew Branch of the TypeScript team:

> "use a loose version range"
>
> This is my advice as well and should probably be mentioned in the article TBH.
>
> — Kent C. Dodds (@kentcdodds) August 18, 2019

Andrew makes the useful point that if you are adding support for TypeScript via an @types/... dependency then it's wise to do so with a loose version range. In the case of RTL we did it like this:

"@types/testing-library__react": "^9.1.0"

i.e. Any type definition with a version of 9.1 or greater (whilst still lower than 10.0.0) is considered valid. You could go even looser than that. If you really don't want to think about TypeScript beyond adding the dependency then a completely loose version range would do:

"@types/testing-library__react": "*"

This will always install the latest version of the @types/testing-library__react dependency and (importantly) allow users to override if there's a problematic @types/testing-library__react out there. This level of looseness is not really advised though: in the scenario where a library (and its associated type definitions) does a major release, users of the old major would get the wrong definitions by default when installing or upgrading (in range).

Probably the most helpful approach is the approach followed by RTL; fixing the major version but allowing all minor and patch releases inside a major version.

Update 2: Further Discussions!#

The technique used in this blog post sparked an interesting conversation with members of the TypeScript team when it was applied to https://github.com/testing-library/jest-dom. The conversation can be read here.

Using TypeScript and ESLint with webpack (fork-ts-checker-webpack-plugin new feature!)

The fork-ts-checker-webpack-plugin has, since its inception, performed two classes of checking:

  1. Compilation errors which the TypeScript compiler surfaces up
  2. Linting issues which TSLint reports

You may have caught the announcement that TSLint is being deprecated and ESLint is the future of linting in the TypeScript world. This plainly has a bearing on linting in fork-ts-checker-webpack-plugin.

I've been beavering away at adding support for ESLint to the fork-ts-checker-webpack-plugin. I'm happy to say, the plugin now supports ESLint. Do you want to get your arms all around ESLint with fork-ts-checker-webpack-plugin? Read on!

How do you migrate from TSLint to ESLint?#

Well, first of all you need the latest and greatest fork-ts-checker-webpack-plugin. Support for ESLint shipped with v1.4.0.

You need to change the options you supply to the plugin in your webpack.config.js to look something like this:

new ForkTsCheckerWebpackPlugin({ eslint: true })

You'll also need to add the various ESLint-related packages to your package.json:

yarn add eslint @typescript-eslint/parser @typescript-eslint/eslint-plugin --dev
  • eslint
  • @typescript-eslint/parser: The parser that will allow ESLint to lint TypeScript code
  • @typescript-eslint/eslint-plugin: A plugin that contains ESLint rules that are TypeScript specific

If you want, you can pass options to ESLint using the eslintOptions option as well. These will be passed through to the underlying ESLint CLI Engine when it is instantiated. The supported options are documented here.
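
For instance, that might look something like the sketch below (the particular option shown is purely an illustration, not a recommendation):

new ForkTsCheckerWebpackPlugin({
  eslint: true,
  // eslintOptions are handed straight through to the ESLint CLI Engine
  eslintOptions: {
    cache: true,
  },
});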

Go Configure#

Now you're ready to use ESLint; you just need to give it some configuration. Typically, an .eslintrc.js is what you want here.

const path = require('path');

module.exports = {
  parser: '@typescript-eslint/parser', // Specifies the ESLint parser
  plugins: [
    '@typescript-eslint',
  ],
  env: {
    browser: true,
    jest: true
  },
  extends: [
    'plugin:@typescript-eslint/recommended', // Uses the recommended rules from the @typescript-eslint/eslint-plugin
  ],
  parserOptions: {
    project: path.resolve(__dirname, './tsconfig.json'),
    tsconfigRootDir: __dirname,
    ecmaVersion: 2018, // Allows for the parsing of modern ECMAScript features
    sourceType: 'module', // Allows for the use of imports
  },
  rules: {
    // Place to specify ESLint rules. Can be used to overwrite rules specified from the extended configs
    // e.g. "@typescript-eslint/explicit-function-return-type": "off",
    '@typescript-eslint/explicit-function-return-type': 'off',
    '@typescript-eslint/no-unused-vars': 'off',
  },
};

If you're a React person (and I am!) then you'll also need: yarn add eslint-plugin-react. Then enrich your eslintrc.js a little:

const path = require('path');

module.exports = {
  parser: '@typescript-eslint/parser', // Specifies the ESLint parser
  plugins: [
    '@typescript-eslint',
    'react'
    // 'prettier' commented as we don't want to run prettier through eslint because performance
  ],
  env: {
    browser: true,
    jest: true
  },
  extends: [
    'plugin:@typescript-eslint/recommended', // Uses the recommended rules from the @typescript-eslint/eslint-plugin
    'prettier/@typescript-eslint', // Uses eslint-config-prettier to disable ESLint rules from @typescript-eslint/eslint-plugin that would conflict with prettier
    // 'plugin:react/recommended', // Uses the recommended rules from @eslint-plugin-react
    'prettier/react', // disables react-specific linting rules that conflict with prettier
    // 'plugin:prettier/recommended' // Enables eslint-plugin-prettier and displays prettier errors as ESLint errors. Make sure this is always the last configuration in the extends array.
  ],
  parserOptions: {
    project: path.resolve(__dirname, './tsconfig.json'),
    tsconfigRootDir: __dirname,
    ecmaVersion: 2018, // Allows for the parsing of modern ECMAScript features
    sourceType: 'module', // Allows for the use of imports
    ecmaFeatures: {
      jsx: true // Allows for the parsing of JSX
    }
  },
  rules: {
    // Place to specify ESLint rules. Can be used to overwrite rules specified from the extended configs
    // e.g. "@typescript-eslint/explicit-function-return-type": "off",
    '@typescript-eslint/explicit-function-return-type': 'off',
    '@typescript-eslint/no-unused-vars': 'off',
    // These rules don't add much value, are better covered by TypeScript and good definition files
    'react/no-direct-mutation-state': 'off',
    'react/no-deprecated': 'off',
    'react/no-string-refs': 'off',
    'react/require-render-return': 'off',
    'react/jsx-filename-extension': [
      'warn',
      {
        extensions: ['.jsx', '.tsx']
      }
    ], // also want to use with ".tsx"
    'react/prop-types': 'off' // Is this incompatible with TS props type?
  },
  settings: {
    react: {
      version: 'detect' // Tells eslint-plugin-react to automatically detect the version of React to use
    }
  }
};

You can add Prettier into the mix too. You can see how it is used in the above code sample. But given the impact that has on performance I wouldn't recommend it; hence it's commented out. There's a good piece by Rob Cooper with more details on setting up Prettier and VS Code with TypeScript and ESLint.

Performance and Power Tools#

It's worth noting that support for TypeScript in ESLint is still brand new. As such, the rule of "Make it Work, Make it Right, Make it Fast" applies.... ESLint with TypeScript still has some performance issues which should be ironed out in the fullness of time. You can track them here.

This is important to bear in mind as, when I converted a large codebase over to using ESLint, I discovered that initial performance of linting was terribly slow. Something that's worth doing right now is identifying which rules are costing you the most time and tweaking your config based on whether you think they're earning their keep.

The TIMING environment variable can be used to provide a report on the relative cost performance wise of running each rule. A nice way to plug this into your workflow is to add the cross-env package to your project: yarn add cross-env -D and then add 2 scripts to your package.json:

"lint": "eslint ./",
"lint-rule-timings": "cross-env TIMING=1 yarn lint"
  • lint - just runs the linter standalone
  • lint-rule-timings - does the same but with the TIMING environment variable set to 1 so a report will be generated.

I'd advise making use of lint-rule-timings to identify which rules are costing you performance, and then turning off rules as you need to. Remember, different rules have different value.
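
Turning a rule off is just a line in the rules section of your .eslintrc.js; for example (the rule named below is purely illustrative - pick whichever rule your TIMING report shows to be expensive and low value for you):

module.exports = {
  // ...parser, plugins, extends etc. as before
  rules: {
    // hypothetical example: disable a type-aware rule the TIMING report showed to be costly
    '@typescript-eslint/no-unnecessary-type-assertion': 'off',
  },
};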

Finally, if you'd like to see how it's done, here's an example of porting from TSLint to ESLint.

TypeScript / webpack - you down with PnP? Yarn, you know me!

Yarn PnP is an innovation by the Yarn team designed to speed up module resolution by node. To quote the (excellent) docs:

Plug’n’Play is an alternative installation strategy unveiled in September 2018...

The way regular installs work is simple: Yarn generates a node_modules directory that Node is then able to consume. In this context, Node doesn’t know the first thing about what a package is: it only reasons in terms of files. “Does this file exist here? No? Let’s look in the parent node_modules then. Does it exist here? Still no? Too bad… parent folder it is!” - and it does this until it matches something that matches one of the possibilities. That’s vastly inefficient.

When you think about it, Yarn knows everything about your dependency tree - it evens installs it! So why is Node tasked with locating your packages on the disk? Why don’t we simply query Yarn, and let it tell us where to look for a package X required by a package Y? That’s what Plug’n’Play (abbreviated PnP) is. Instead of generating a node_modules directory and leaving the resolution to Node, we now generate a single .pnp.js file and let Yarn tell us where to find our packages.

Yarn PnP has been worked on by, amongst others, the excellent Maël Nison. You can hear him talking about it in person in this talk at JSConfEU.

Thanks particularly to Maël's work, it's possible to use Yarn PnP with TypeScript using webpack with ts-loader and fork-ts-checker-webpack-plugin. This post intends to show you just how simple it is to convert a project that uses either to work with Yarn PnP.

Vanilla ts-loader#

Let's say your project is built using standalone ts-loader; i.e. a simple setup where ts-loader handles both transpilation and type checking.

First things first, add this property to your package.json: (this is only required if you are using Yarn 1; this tag will be optional starting from v2, where projects will switch to PnP by default.)

{
  "installConfig": {
    "pnp": true
  }
}

Also, because this is webpack, we're going to need to add an extra dependency in the form of pnp-webpack-plugin:

yarn add -D pnp-webpack-plugin

To quote the excellent docs, make the following amends to your webpack.config.js:

const PnpWebpackPlugin = require(`pnp-webpack-plugin`);

module.exports = {
  module: {
    rules: [
      {
        test: /\.ts$/,
        loader: require.resolve('ts-loader'),
        options: PnpWebpackPlugin.tsLoaderOptions(),
      },
    ],
  },
  resolve: {
    plugins: [PnpWebpackPlugin],
  },
  resolveLoader: {
    plugins: [PnpWebpackPlugin.moduleLoader(module)],
  },
};

If you have any options you want to pass to ts-loader, just pass them as a parameter to pnp-webpack-plugin's tsLoaderOptions function and it will take care of forwarding them properly. Behind the scenes the tsLoaderOptions function is providing ts-loader with the options necessary to switch into Yarn PnP mode.
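
For example, a small sketch (the option shown here is purely illustrative):

options: PnpWebpackPlugin.tsLoaderOptions({
  // any regular ts-loader option can go here and will be forwarded on
  silent: true,
}),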

Congratulations; you now have ts-loader functioning with Yarn PnP support!

fork-ts-checker-webpack-plugin with ts-loader#

You may well be using fork-ts-checker-webpack-plugin to handle type checking whilst ts-loader gets on with the transpilation. This workflow is also supported using pnp-webpack-plugin. You'll need to have followed the same steps as in the ts-loader setup above; it's just the webpack.config.js tweaks that will be different.

const ForkTsCheckerWebpackPlugin = require('fork-ts-checker-webpack-plugin');
const PnpWebpackPlugin = require(`pnp-webpack-plugin`);

module.exports = {
  plugins: [
    new ForkTsCheckerWebpackPlugin(
      PnpWebpackPlugin.forkTsCheckerOptions({
        useTypescriptIncrementalApi: false, // not possible to use this until: https://github.com/microsoft/TypeScript/issues/31056
      })
    ),
  ],
  module: {
    rules: [
      {
        test: /\.ts$/,
        loader: require.resolve('ts-loader'),
        options: PnpWebpackPlugin.tsLoaderOptions({ transpileOnly: true }),
      },
    ],
  },
  resolve: {
    plugins: [PnpWebpackPlugin],
  },
  resolveLoader: {
    plugins: [PnpWebpackPlugin.moduleLoader(module)],
  },
};

Again, if you have any options you want to pass to ts-loader, just pass them as a parameter to pnp-webpack-plugin's tsLoaderOptions function. As we're using fork-ts-checker-webpack-plugin we're going to want to stop ts-loader doing type checking with the transpileOnly: true option.

We're now initialising fork-ts-checker-webpack-plugin with pnp-webpack-plugin's forkTsCheckerOptions function. Behind the scenes the forkTsCheckerOptions function is providing the fork-ts-checker-webpack-plugin with the options necessary to switch into Yarn PnP mode.

And that's it! You now have ts-loader and fork-ts-checker-webpack-plugin functioning with Yarn PnP support!

Living on the Bleeding Edge#

Whilst you can happily develop and build using Yarn PnP, it's worth bearing in mind that this is a new approach. As such, there are some rough edges right now.

If you're interested in Yarn PnP, it's worth taking the v2 of Yarn (Berry) for a spin. You can find it here: https://github.com/yarnpkg/berry. It's where most of the Yarn PnP work happens, and it includes zip loading - two birds, one stone!

Because there isn't first class support for Yarn PnP in TypeScript itself yet, you cannot make use of the Watch API through fork-ts-checker-webpack-plugin. (You can read about that issue here)

As you've likely noticed, the webpack configuration required makes for a noisy webpack.config.js. Further to that, VS Code (which is powered by TypeScript remember) has no support for Yarn PnP yet and so will present resolution errors to you. If you can ignore the sea of red squigglies all over your source files in the editor and just look at your webpack build you'll be fine.

There is a tool called PnPify that adds support for PnP to TypeScript (in particular tsc). You can find more information here: https://yarnpkg.github.io/berry/advanced/pnpify. For tsc it would be:

$> yarn pnpify tsc [...]

The gist is that it simulates the existence of node_modules by leveraging the data from the PnP file. As such it's not a perfect fix (pnp-webpack-plugin is a better integration), but it's a very useful tool to have to unblock yourself when using a project that doesn't support it.

PnPify actually allows us to use TypeScript in VSCode with PnP! Its documentation is here: https://yarnpkg.github.io/berry/advanced/pnpify#vscode-support

All of these hindrances should hopefully be resolved in future. Ideally, one day a good developer experience can be the default experience. In the meantime, you can still dev - just be prepared for the rough edges. Here are some useful resources to track the future of support:

This last one would be nice because:

  • We'd stop having to patch require
  • We probably wouldn't have to use yarn node if Node itself was able to find the loader somehow (such as if it was listed in the package.json metadata)

Thanks to Maël for his tireless work on Yarn. To my mind Maël is certainly a candidate for the hardest worker in open source. I've been shamelessly borrowing his excellent docs for this post - thanks for writing so excellently Maël!

TypeScript and high CPU usage - watch don't stare!

I'm one of the maintainers of the fork-ts-checker-webpack-plugin. Hi there!

Recently, various issues have been raised against create-react-app (which uses fork-ts-checker-webpack-plugin) as well as against the plugin itself. They've been related to the level of CPU usage in watch mode on idle; i.e. it's high!

Why High?#

Now, under the covers, the fork-ts-checker-webpack-plugin uses the TypeScript watch API.

The marvellous John (not me - another John) did some digging and discovered the root cause came down to the way that the TypeScript watch API watches files:

TS uses internally the fs.watch and fs.watchFile API functions of nodejs for their watch mode. The latter function is even not recommended by nodejs documentation for performance reasons, and urges to use fs.watch instead.

NodeJS doc:

Using fs.watch() is more efficient than fs.watchFile and fs.unwatchFile. fs.watch should be used instead of fs.watchFile and fs.unwatchFile when possible.
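
To make the distinction concrete, here's a tiny illustrative Node script (the file name is arbitrary): fs.watchFile polls on an interval and so does work even when nothing changes, whereas fs.watch subscribes to file system events.

const fs = require('fs');

// fs.watchFile: polls the file every `interval` ms - CPU is spent even while idle
fs.watchFile('tsconfig.json', { interval: 250 }, () => {
  console.log('tsconfig.json changed (detected by polling)');
});

// fs.watch: relies on file system events - effectively free while idle
fs.watch('tsconfig.json', () => {
  console.log('tsconfig.json changed (detected by fs events)');
});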

"there is another"#

John also found that there are other file watching behaviours offered by TypeScript. What's more, the file watching behaviour is configurable with an environment variable. That's right, if an environment variable called TSC_WATCHFILE is set, it controls the file watching approach used. Big news!

John did some rough benchmarking of the performance of the different options that can be set, on his PC running 64-bit Linux. Here's how it came out:

| Value | CPU usage on idle |
| --- | --- |
| TS default (TSC_WATCHFILE not set) | 7.4% |
| UseFsEventsWithFallbackDynamicPolling | 0.2% |
| UseFsEventsOnParentDirectory | 0.2% |
| PriorityPollingInterval | 6.2% |
| DynamicPriorityPolling | 0.5% |
| UseFsEvents | 0.2% |

As you can see, the default performs poorly. On the other hand, an option like UseFsEventsWithFallbackDynamicPolling is, by comparison, greased lightning.

workaround!#

To get this better experience into your world now, you could just set an environment variable on your machine. However, that doesn't scale; let's instead look at introducing the environment variable into your project explicitly.

We're going to do this in a cross platform way using [cross-env](https://github.com/kentcdodds/cross-env). This is a mighty useful utility by Kent C Dodds which allows you to set environment variables in a way that will work on Windows, Mac and Linux. Imagine it as the jQuery of the environment variables world :-)

Let's add it as a devDependency:

yarn add -D cross-env

Then take a look at your package.json. You've probably got a start script that looks something like this:

"start": "webpack-dev-server --progress --color --mode development --config webpack.config.development.js",

Or if you're a create-react-app user maybe this:

"start": "react-scripts start",

Prefix your start script with cross-env TSC_WATCHFILE=UseFsEventsWithFallbackDynamicPolling. This will, when run, initialise an environment variable called TSC_WATCHFILE with the value UseFsEventsWithFallbackDynamicPolling. Then it will start your development server as it did before. When TypeScript is fired up by webpack it will see this environment variable and use it to configure the file watching behaviour to one of the more performant options.

So, in the case of a create-react-app user, your finished start script would look like this:

"start": "cross-env TSC_WATCHFILE=UseFsEventsWithFallbackDynamicPolling react-scripts start",

The Future#

There's a possibility that the default watch behaviour may change in TypeScript in future. It's currently under discussion; you can read more here.

The Big One Point Oh

It's time for the first major version of fork-ts-checker-webpack-plugin. It's been a long time coming :-)

A Little History#

The fork-ts-checker-webpack-plugin was originally the handiwork of Piotr Oleś. He raised an issue with ts-loader suggesting it could be the McCartney to ts-loader's Lennon:

Hi everyone!

I've created webpack plugin: fork-ts-checker-webpack-plugin that plays nicely with ts-loader. The idea is to compile project with transpileOnly: true and check types on separate process (async). With this approach, webpack build is not blocked by type checker and we have semantic check with fast incremental build. More info on github repo :)

So if you like it and you think it would be good to add some info in README.md about this plugin, I would be greatful.

Thanks :)

We did like it. We did think it would be good. We took him up on his kind offer.

Since that time many people have had their paws on the fork-ts-checker-webpack-plugin codebase. We love them all.

One Point Oh#

We could have had our first major release a long time ago. The idea first occurred when webpack 5 alpha appeared. "Huh, look at that, a major version number.... Maybe we should do that?" "Great idea chap - do it!" So here it is; fresh out the box: v1.0.0

There are actually no breaking changes that we're aware of; users of 0.x fork-ts-checker-webpack-plugin should be able to upgrade without any drama.

Incremental Watch API on by Default#

Users of TypeScript 3+ may notice a performance improvement as by default the plugin now uses the incremental watch API in TypeScript.

Should this prove problematic you can opt out of using it by supplying useTypescriptIncrementalApi: false. We are aware of an issue with Vue and the incremental API. We hope it will be fixed soon - a generous member of the community is taking a look. In the meantime, we will not default to using the incremental watch API when in Vue mode.

Compatibility#

As it stands, the plugin supports webpack 2, 3, 4 and 5 alpha. It is compatible with TypeScript 2.1+ and TSLint 4+.

Right that's it - enjoy it! And thanks everyone for contributing - we really dig your help. Much love.

TypeScript and webpack: Watch It

All I ask for is a compiler and a tight feedback loop. Narrowing the gap between making a change to a program and seeing the effect of that is a productivity boon. The TypeScript team are wise cats and dig this. They've taken strides to improve the developer experience of TypeScript users by introducing a "watch" API which can be leveraged by other tools. To quote the docs:

TypeScript 2.7 introduces two new APIs: one for creating "watcher" programs that provide set of APIs to trigger rebuilds, and a "builder" API that watchers can take advantage of... This can speed up large projects with many files.

Recently the wonderful 0xorial opened a PR to add support for the watch API to the fork-ts-checker-webpack-plugin.

I took this PR for a spin on a large project that I work on. With my machine, I was averaging 12 seconds between incremental builds. (I will charitably describe the machine in question as "challenged"; hobbled by one of the most aggressive virus checkers known to mankind. Fist bump InfoSec 🤜🤛😉) Switching to using the watch API dropped this to a mere 1.5 seconds!

You Can Watch Too#

0xorial's PR was merged toot suite and has been released as the @next version of the plugin. If you'd like to take this for a spin then you can. Just:

  1. Up your version of fork-ts-checker-webpack-plugin to the @next version in your package.json
  2. Add useTypescriptIncrementalApi: true to the plugin when you initialise it in your webpack.config.js (see the sketch after this list).
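
For step 2, that amounts to something like the following in your webpack.config.js (a sketch; the rest of your config stays as it is):

const ForkTsCheckerWebpackPlugin = require('fork-ts-checker-webpack-plugin');

module.exports = {
  // ...the rest of your webpack config
  plugins: [
    new ForkTsCheckerWebpackPlugin({
      // opt in to the TypeScript watch API
      useTypescriptIncrementalApi: true,
    }),
  ],
};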

That's it.

Mary Poppins#

Sorry, I was trying to paint a word picture of something you might watch that was also comforting. Didn't quite work...

Anyway, you might be thinking "wait, just hold on a minute.... he said @next - I am not that bleeding edge." Well, it's not like that. Don't be scared.

fork-ts-checker-webpack-plugin has merely been updated for webpack 5 (which is in alpha) and the @next reflects that. To be clear, the @next version of the plugin still supports (remarkably!) webpack 2, 3 and 4 as well as 5 alpha. Users of current and historic versions of webpack should feel safe using the @next version; for webpack 2, 3 and 4 expect stability. webpack 5 users should expect potential changes to align with webpack 5 as it progresses.

Roadmap#

This is available now and we'd love for you to try it out. As you can see, at the moment it's opt-in. You have to explicitly choose to use the new behaviour. Depending upon how testing goes, we may look to make this the default behaviour for the plugin in future (assuming users are running a high enough version of TypeScript). It would be great to hear from people if they have any views on that, or feedback in general.

Much ❤️ y'all. And many thanks to the very excellent 0xorial for the hard work.

ts-loader Project References: First Blood

So project references eh? They shipped with TypeScript 3. We've just shipped initial support for project references in ts-loader v5.2.0. All the hard work was done by the amazing Andrew Branch. In fact I'd recommend taking a gander at the PR. Yay Andrew!

This post will take us through the nature of the support for project references in ts-loader now and what we hope the future will bring. It ~~rips off shamelessly~~ borrows from the README.md documentation that Andrew wrote as part of the PR. Because I am not above stealing.

TL;DR#

Using project references currently requires building referenced projects outside of ts-loader. We don’t want to keep it that way, but we’re releasing what we’ve got now. To try it out, you’ll need to pass projectReferences: true to loaderOptions.
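
In webpack.config.js terms that's a sketch along these lines (the test pattern and the rest of the config are illustrative assumptions):

module.exports = {
  module: {
    rules: [
      {
        test: /\.tsx?$/,
        loader: 'ts-loader',
        options: {
          // tell ts-loader to resolve imports from referenced, already-built projects
          projectReferences: true,
        },
      },
    ],
  },
};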

Like tsc, but not like tsc --build#

ts-loader has partial support for project references in that it will load dependent composite projects that are already built, but will not currently build/rebuild those upstream projects. The best way to explain exactly what this means is through an example. Say you have a project with a project reference pointing to the lib/ directory:

tsconfig.json
app.ts
lib/
  tsconfig.json
  niftyUtil.ts

And we’ll assume that the root tsconfig.json has { "references": [{ "path": "lib" }] }, which means that any import of a file that’s part of the lib sub-project is treated as a reference to another project, not just a reference to a TypeScript file. Before discussing how ts-loader handles this, it’s helpful to review at a really basic level what tsc itself does here. If you were to run tsc on this tiny example project, the build would fail with the error:

error TS6305: Output file 'lib/niftyUtil.d.ts' has not been built from source file 'lib/niftyUtil.ts'.

Using project references actually instructs tsc not to build anything that’s part of another project from source, but rather to look for any .d.ts and .js files that have already been generated from a previous build. Since we’ve never built the project in lib before, those files don’t exist, so building the root project fails. Still just thinking about how tsc works, there are two options to make the build succeed: either run tsc -p lib/tsconfig.json first, or simply run tsc --build, which will figure out that lib hasn’t been built and build it first for you.
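
As an aside, for lib to be referenceable at all it needs to be a composite project. Here's a minimal sketch of what lib/tsconfig.json might contain (an illustrative assumption - your real config will have more in it):

{
  "compilerOptions": {
    "composite": true,
    "declaration": true
  }
}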

Ok, so how is that relevant to ts-loader? Because the best way to think about what ts-loader does with project references is that it acts like tsc, but not like tsc --build. If you run ts-loader on a project that’s using project references, and any upstream project hasn’t been built, you’ll get the exact same error TS6305 that you would get with tsc. If you modify a source file in an upstream project and don’t rebuild that project, ts-loader won’t have any idea that you’ve changed anything—it will still be looking at the output from the last time you built that file.

“Hey, don’t you think that sounds kind of useless and terrible?”#

Well, sort of. You can consider it a work-in-progress. It’s true that on its own, as of today, ts-loader doesn’t have everything you need to take advantage of project references in webpack. In practice, though, consuming upstream projects and building upstream projects are somewhat separate concerns. Building them will likely come in a future release. For background, see the original issue.

outDir Windows problemo.#

At the moment, composite projects built using the outDir compiler option cannot be consumed using ts-loader on Windows. If you try to, ts-loader throws a "has not been built from source file" error. You can see Andrew and I puzzling over it in the PR. We don't know why yet; it's possible there's a bug in tsc. It's more likely there's a bug in ts-loader. Hopefully it's going to get solved at some point. (Hey, maybe you're the one to solve it!) Either way, we didn't want to hold back from releasing. So if you're building on Windows then avoid building composite projects using outDir.

Ivan Drago and Definitely Typed

This a tale of things that are and things that aren't. It's a tale of semantic versioning, the lack thereof and heartbreak. It's a story of terror and failing builds. But it has a bittersweet ending wherein our heroes learn a lesson and understand the need for compromise. We all come out better and wiser people. Hopefully there's something for everybody; let's start with an exciting opener and see where it goes...

Definitely Typed#

This is often the experience people have of using type definitions from Definitely Typed:

Specifically, people are used to the idea of semantic versioning and expect it from types published to npm by Definitely Typed. They wait in vain. I've written before about the Definitely Typed / @types semantic version compromise. And I wanted to talk about it a little further as (watching the issues raised on DT) I don't think the message has quite got out there. To summarise:

  1. npm is built on top of semantic versioning and they take it seriously. When a package is published it should be categorised as a major release (breaking changes), a minor release (extra functionality which is backwards compatible) or a patch release (backwards compatible bug fixes).

  2. Definitely Typed publishes type definitions to npm under the @types namespace

  3. To make consumption of type definitions easier, the versioning of a type definition package will seek to emulate the versioning of the npm package it supports. For example, right now [react-router](https://www.npmjs.com/package/react-router)'s latest version is 4.3.1. The corresponding type definition [@types/react-router](https://www.npmjs.com/package/@types/react-router)'s latest version is 4.0.31. (It's fairly common for type definition versions to lag behind the package they type.)

    If there's a breaking change to the react-router type definition then the new version published will have a version number that begins "4.0.". If you are relying on semantic versioning this will break you.

I Couldn't Help But Notice Your Pain#

If you're reading this and can't quite believe that @types would be so inconsiderate as to break the conventions of the ecosystem it lives in, I understand. But hopefully you can see there are reasons for this. In the end, being able to use npm as a delivery mechanism for versioned type definitions associated with another package has a cost; that cost is semantic versioning for the type definitions themselves. It wasn't a choice taken lightly; it's a pragmatic compromise.

"But what about my failing builds? Fine, people are going to change type definitions, but why should I burn because of their choices?"

Excellent question. Truly. Well here's my advice: don't expect semantic versioning where there is none. Use specific package versions. You can do that directly with your package.json. For example replace something like this: "@types/react-router": "^4.0.0" with a specific version number: "@types/react-router": "4.0.31". With this approach it's a specific activity to upgrade your type definitions. A chore if you will; but a chore that guarantees builds will not fail unexpectedly due to changing type defs.

My own personal preference is yarn (Mother, I'm in love with a yarn.lock file), the alternative npm client that came out of Facebook. It pins the exact versions of all packages used in your yarn.lock file and guarantees to install the same versions each time. Problem solved; and it even allows me to keep the semantic versioning in my package.json as is.

This has some value in that when I upgrade I probably want to upgrade to a newer version following the semantic versioning convention. I should just expect that I'll need to check valid compilation when I do so. yarn even has its own built-in utility that tells you when things are out of date: yarn outdated:

So lovely

You Were Already Broken - I Just Showed You How#

Before I finish I wanted to draw out one reason why breaking changes can be a reason for happiness. Because sometimes your code is wrong. An update to a type definition may highlight that. This is analogous to when the TypeScript compiler ships a new version. When I upgrade to a newer version of TypeScript it lights up errors in my codebase that I hadn't spotted. Yay compiler!

An example of this is a PR I submitted to DefinitelyTyped earlier this week. This PR changed how react-router models the parameters of a Match. Until now, an object was expected; the user could define any object they liked. However, react-router will only produce string values for a parameter. If you look at the underlying code it's nothing more than an exec on a regular expression.

My PR enforces this at type level by changing this:

export interface match<P> {
  params: P;
  ...
}

To this:

export interface match<Params extends { [K in keyof Params]?: string } = {}> {
  params: Params;
  ...
}
So any object definition supplied must have string values (and you don't actually need to supply an object definition; that's optional now).
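
To see the effect, here's a hypothetical usage sketch; with the new definition a non-string parameter type is rejected at compile time rather than silently behaving oddly at runtime:

import { match } from 'react-router';

interface ProductParams {
  id: number; // numbers are never what react-router actually gives you
}

// Error: 'ProductParams' does not satisfy the constraint (id is a number, not a string)
declare const productMatch: match<ProductParams>;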

I expected this PR to break people and it did. But this is a useful break. If they were relying upon their parameters to be types other than strings they would be experiencing some unexpected behaviour. In fact, it's exactly this that prompted my PR in the first place. A colleague had defined his parameters as numbers and couldn't understand why they weren't behaving like numbers. Because they weren't numbers! And wonderfully, this will now be caught at compile time; not runtime. Yay!

Auth0, TypeScript and ASP.NET Core

Most applications I write have some need for authentication and perhaps authorisation too. In fact, most apps most people write fall into that bracket. Here's the thing: Auth done well is a *big* chunk of work. And the minute you start thinking about that you almost invariably lose focus on the thing you actually want to build and ship.

So this Christmas I decided it was time to take a look into offloading that particular problem onto someone else. I knew there were third parties who provided Auth-As-A-Service - time to give them a whirl. On the recommendation of a friend, I made Auth0 my first port of call. Lest you be expecting a full breakdown of the various players in this space, let me stop you now; I liked Auth0 so much I strayed no further. Auth0 kicks AAAS. (I'm so sorry)

What I wanted to build#

My criteria for "auth success" was this:

  • I want to build a SPA, specifically a React SPA. Ideally, I shouldn't need a back end of my own at all
  • I want to use TypeScript on my client.

But, for when I do implement a back end:

  • I want that to be able to use the client side's Auth tokens to allow access to Auth routes on my server.
  • I want to be able to identify the user, given the token, to provide targeted data
  • Oh, and I want to use .NET Core 2 for my server.

And in achieving all of this I want to add minimal code to my app. Not War and Peace. My code should remain focused on doing what it does.

Boil a Plate#

I ended up with unqualified ticks for all my criteria, but it took some work to find out. I will say that Auth0 do travel the extra mile in terms of getting you up and running. When you create a new Client in Auth0 you're given the option to download a quick start using the technology of your choice.

This was a massive plus for me. I took the quickstart provided and ran with it to get me to the point of meeting my own criteria. You can use this boilerplate for your own ends. Herewith, a walkthrough:

The Walkthrough#

Fork and clone the repo at this location: https://github.com/johnnyreilly/auth0-react-typescript-asp-net-core.

What have we got? 2 folders, ClientApp contains the React app, Web contains the ASP.NET Core app. Now we need to get setup with Auth0 and customise our config.

Setup Auth0#

Here's how to get the app set up with Auth0; you're going to need to sign up for a (free) Auth0 account. Then login into Auth0 and go to the management portal.

Client#

  • Create a Client with the name of your choice and use the Single Page Web Applications template.
  • From the new Client Settings page take the Domain and Client ID and update the similarly named properties in the appsettings.Development.json and appsettings.Production.json files with these settings.
  • To the Allowed Callback URLs setting add the URLs: http://localhost:3000/callback,http://localhost:5000/callback - the first of these facilitates running in Debug mode, the second in Production mode. If you were to deploy this you'd need to add other callback URLs in here too.

API#

  • Create an API with the name of your choice (I recommend the same as the Client to avoid confusion), an identifier which can be anything you like; I like to use the URL of my app but it's your call.
  • From the new API Settings page take the Identifier and update the Audience property in the appsettings.Development.json and appsettings.Production.json files with that value.

Running the App#

Production build#

Build the client app with yarn build in the ClientApp folder. (Don't forget to yarn install first.) Then, in the Web folder dotnet restore, dotnet run and open your browser to http://localhost:5000

Debugging#

Run the client app using webpack-dev-server using yarn start in the ClientApp folder. Fire up VS Code in the root of the repo and hit F5 to debug the server. Then open your browser to http://localhost:3000

The Tour#

When you fire up the app you're presented with "you are not logged in!" message and the option to login. Do it, it'll take you to the Auth0 "lock" screen where you can sign up / login. Once you do that you'll be asked to confirm access:

All this is powered by Auth0's auth0-js npm package. (Excellent type definition files are available from Definitely Typed; I'm using the @types/auth0-js package DT publishes.) Usage of which is super simple; it exposes an authorize method that when called triggers the Auth0 lock screen. Once you've "okayed" you'll be taken back to the app which will use the parseHash method to extract the access token that Auth0 has provided. Take a look at how our authStore makes use of auth0-js: (don't be scared; it uses mobx - but you could use anything)

authStore.ts#

import { Auth0UserProfile, WebAuth } from 'auth0-js';
import { action, computed, observable, runInAction } from 'mobx';
import { IAuth0Config } from '../../config';
import { StorageFacade } from '../storageFacade';

interface IStorageToken {
  accessToken: string;
  idToken: string;
  expiresAt: number;
}

const STORAGE_TOKEN = 'storage_token';

export class AuthStore {
  @observable.ref auth0: WebAuth;
  @observable.ref userProfile: Auth0UserProfile;
  @observable.ref token: IStorageToken;

  constructor(config: IAuth0Config, private storage: StorageFacade) {
    this.auth0 = new WebAuth({
      domain: config.domain,
      clientID: config.clientId,
      redirectUri: config.redirectUri,
      audience: config.audience,
      responseType: 'token id_token',
      scope: 'openid email profile do:admin:thing' // the do:admin:thing scope is custom and defined in the scopes section of our API in the Auth0 dashboard
    });
  }

  initialise() {
    const token = this.parseToken(this.storage.getItem(STORAGE_TOKEN));
    if (token) {
      this.setSession(token);
    }
    this.storage.addEventListener(this.onStorageChanged);
  }

  parseToken(tokenString: string) {
    const token = JSON.parse(tokenString || '{}');
    return token;
  }

  onStorageChanged = (event: StorageEvent) => {
    if (event.key === STORAGE_TOKEN) {
      this.setSession(this.parseToken(event.newValue));
    }
  }

  @computed get isAuthenticated() {
    // Check whether the current time is past the
    // access token's expiry time
    return this.token && new Date().getTime() < this.token.expiresAt;
  }

  login = () => {
    this.auth0.authorize();
  }

  handleAuthentication = () => {
    this.auth0.parseHash((err, authResult) => {
      if (authResult && authResult.accessToken && authResult.idToken) {
        const token = {
          accessToken: authResult.accessToken,
          idToken: authResult.idToken,
          // Set the time that the access token will expire at
          expiresAt: authResult.expiresIn * 1000 + new Date().getTime()
        };
        this.setSession(token);
      } else if (err) {
        // tslint:disable-next-line:no-console
        console.log(err);
        alert(`Error: ${err.error}. Check the console for further details.`);
      }
    });
  }

  @action
  setSession(token: IStorageToken) {
    this.token = token;
    this.storage.setItem(STORAGE_TOKEN, JSON.stringify(token));
  }

  getAccessToken = () => {
    const accessToken = this.token.accessToken;
    if (!accessToken) {
      throw new Error('No access token found');
    }
    return accessToken;
  }

  @action
  loadProfile = async () => {
    const accessToken = this.token.accessToken;
    if (!accessToken) {
      return;
    }
    this.auth0.client.userInfo(accessToken, (err, profile) => {
      if (err) { throw err; }
      if (profile) {
        runInAction(() => this.userProfile = profile);
        return profile;
      }
      return undefined;
    });
  }

  @action
  logout = () => {
    // Clear access token and ID token from local storage
    this.storage.removeItem(STORAGE_TOKEN);
    this.token = null;
    this.userProfile = null;
  }
}

Once you're logged in the app offers you more in the way of navigation options. A "Profile" screen shows you the details your React app has retrieved from Auth0 about you. This is backed by the client.userInfo method on auth0-js. There's also a "Ping" screen which is where your React app talks to your ASP.NET Core server. The screenshot below illustrates the result of hitting the "Get Private Data" button:

The "Get Server to Retrieve Profile Data" button is interesting as it illustrates that the server can get access to your profile data as well. There's nothing insecure here; it gets the details using the access token retrieved from Auth0 by the ClientApp and passed to the server. It's the API we set up in Auth0 that is in play here. The app uses the Domain and the access token to talk to Auth0 like so:

UserController.cs#

// Retrieve the access_token claim which we saved in the OnTokenValidated event
var accessToken = User.Claims.FirstOrDefault(c => c.Type == "access_token").Value;

// If we have an access_token, then retrieve the user's information
if (!string.IsNullOrEmpty(accessToken))
{
    var domain = _config["Auth0:Domain"];
    var apiClient = new AuthenticationApiClient(domain);
    var userInfo = await apiClient.GetUserInfoAsync(accessToken);
    return Ok(userInfo);
}

We can also access the sub claim, which uniquely identifies the user:

UserController.cs#

// We're not doing anything with this, but hey! It's useful to know where the user id lives
var userId = User.Claims.FirstOrDefault(c => c.Type == System.Security.Claims.ClaimTypes.NameIdentifier).Value; // our userId is the sub value

The reason our ASP.NET Core app works with Auth0 and that we have access to the access token here in the first place is because of our startup code:

Startup.cs#

public void ConfigureServices(IServiceCollection services)
{
    var domain = $"https://{Configuration["Auth0:Domain"]}/";
    services.AddAuthentication(options =>
    {
        options.DefaultAuthenticateScheme = JwtBearerDefaults.AuthenticationScheme;
        options.DefaultChallengeScheme = JwtBearerDefaults.AuthenticationScheme;
    }).AddJwtBearer(options =>
    {
        options.Authority = domain;
        options.Audience = Configuration["Auth0:Audience"];
        options.Events = new JwtBearerEvents
        {
            OnTokenValidated = context =>
            {
                if (context.SecurityToken is JwtSecurityToken token)
                {
                    if (context.Principal.Identity is ClaimsIdentity identity)
                    {
                        identity.AddClaim(new Claim("access_token", token.RawData));
                    }
                }
                return Task.FromResult(0);
            }
        };
    });
    // ....

Authorization#

We're pretty much done now; just one magic button to investigate: "Get Admin Data". If you presently try and access the admin data you'll get a 403 Forbidden. It's forbidden because that endpoint relies on the "do:admin:thing" scope in our claims:

UserController.cs#

[Authorize(Scopes.DoAdminThing)]
[HttpGet("api/userDoAdminThing")]
public IActionResult GetUserDoAdminThing()
{
return Ok("Admin endpoint");
}

Scopes.cs#

public static class Scopes
{
// the do:admin:thing scope is custom and defined in the scopes section of our API in the Auth0 dashboard
public const string DoAdminThing = "do:admin:thing";
}

This is wired up in our ASP.NET Core app like so:

Startup.cs#

services.AddAuthorization(options =>
{
options.AddPolicy(Scopes.DoAdminThing, policy => policy.Requirements.Add(new HasScopeRequirement(Scopes.DoAdminThing, domain)));
});
// register the scope authorization handler
services.AddSingleton<IAuthorizationHandler, HasScopeHandler>();

HasScopeHandler.cs#

public class HasScopeHandler : AuthorizationHandler<HasScopeRequirement>
{
protected override Task HandleRequirementAsync(AuthorizationHandlerContext context, HasScopeRequirement requirement)
{
// If user does not have the scope claim, get out of here
if (!context.User.HasClaim(c => c.Type == "scope" && c.Issuer == requirement.Issuer))
return Task.CompletedTask;
// Split the scopes string into an array
var scopes = context.User.FindFirst(c => c.Type == "scope" && c.Issuer == requirement.Issuer).Value.Split(' ');
// Succeed if the scope array contains the required scope
if (scopes.Any(s => s == requirement.Scope))
context.Succeed(requirement);
return Task.CompletedTask;
}
}

The reason we're 403ing at present is because when our HasScopeHandler executes, requirement.Scope has the value of "do:admin:thing" and our scopes do not contain that value. To add it, go to your API in the Auth0 management console and add it:

Note that you can control how this scope is acquired using "Rules" in the Auth0 management portal.

You won't be able to access the admin endpoint yet because you're still rocking with the old access token; pre-newly-added scope. But when you next login to Auth0 you'll see a prompt like this:

Which demonstrates that you're being granted an extra scope. With your new shiny access token you can now access the oh-so-secret Admin endpoint.
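For completeness, the place the ClientApp asks for that scope in the first place is the auth0-js WebAuth configuration. This is a sketch with placeholder values rather than the app's actual settings:

import * as auth0 from 'auth0-js';

// all of these values are placeholders - pull the real ones from your Auth0 application settings
const webAuth = new auth0.WebAuth({
  domain: 'your-tenant.auth0.com',
  clientID: 'YOUR_CLIENT_ID',
  redirectUri: 'http://localhost:3000/callback',
  audience: 'https://your-api-identifier', // the API we set up in Auth0
  responseType: 'token id_token',
  scope: 'openid profile do:admin:thing', // request the custom scope alongside the standard ones
});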

I had some more questions about Auth0 as I'm still new to it myself. To see my question (and the very helpful answer!) go here: https://community.auth0.com/questions/13786/get-user-data-server-side-what-is-a-good-approach

ts-loader 2017 retrospective

2017 is drawing to a close, and it's been a big, big year in webpack-land. It's been a big year for ts-loader too. At the start of the year v1.3.3 was the latest version available, officially supporting webpack 1. (Old school!) We end the year with ts-loader sitting pretty at v3.2.0 and supporting webpack 2 and 3.

Many releases were shipped and that was down to a whole bunch of folk. People helped out with bug fixes, features, advice and docs improvements. All of these help. ts-loader wouldn't be where it is without you, so thanks to everyone that helped out - you rock!

I'm really grateful to all of you. Thanks so much! (Apologies if I've missed anyone out - I know there are more still.)

fork-ts-checker-webpack-plugin build speed improvements#

Alongside others' direct contributions to ts-loader, other projects improved the experience of using ts-loader. Piotr Oleś dropped his [fork-ts-checker-webpack-plugin](https://github.com/Realytics/fork-ts-checker-webpack-plugin) this year which nicely increased build speed when used with ts-loader.

That opened up the possibility of adding HappyPack support. I had the good fortune to work with webpack's Tobias Koppers and ExtraHop's Alex Birmingham on improving TypeScript build speed further.

So what does the future hold?

ts-loader 4.0 (Live webpack or Die Hard)#

The web marches on and webpack gallops alongside. Here's what's in the pipeline for ts-loader in 2018:

Start using the new watch API#

A new watch API is being made available in the TypeScript API. We have a PR from the amazing Sheetal Nandi which adds support to ts-loader. Given that's quite a big PR we want to merge that before anything else lands. The watch API is still being finalised but once it lands in TypeScript we'll look to merge the PR and ship a new version of ts-loader.

Drop custom module resolution#

Historically ts-loader has had its own module resolution mechanism in place. We're going to look to move to use the TypeScript mechanism instead. The old module resolution will be deprecated but will remain available behind a flag for a time. In future we'll look to drop the old mechanism entirely.

Drop support for TypeScript 2.3 and below#

The codebase can be made simpler if we drop support for older versions of TypeScript so that's what we plan to do with our next breaking changes release.

webpack v4 is in alpha now#

If any changes need to happen to ts-loader to support webpack 4 then they will be made. Personally I'm planning to help out with [fork-ts-checker-webpack-plugin](https://github.com/Realytics/fork-ts-checker-webpack-plugin) as there will likely be some changes required there.

contextAsConfigBasePath will be replaced with a context#

The option that landed in the last month doesn't quite achieve the aims of the original PR's author Christian Tinauer. Consequently it's going to be replaced with a new option. This is queued up and ready to go here.

reportFiles option to be added#

Michel Rasschaert is presently working on adding a reportFiles option to ts-loader. You can see the PR in progress here.

Merry Christmas!#

You can expect to see the first releases of ts-loader 4.0 in 2018. In the meantime, I'd like to wish you Merry Christmas and a Happy New Year! And once more, thanks and thanks again to all you generous people who help build ts-loader. You're wonderful and so I'm glad you do what you do... joyeux Noel!

The TypeScript webpack PWA

So, there you sit, conflicted. You've got a lovely build setup; it's a thing of beauty. Precious, polished like a diamond, sharpened like a circular saw. There at the core of your carefully crafted setup sits webpack. Heaving, mysterious... powerful.

There's more. Not only are you sold on webpack, you're all in TypeScript too. But now you've heard tell of "Progressive Web Applications" and "Service Workers".... And you want to be dealt in. You want to build web apps that work offline. It can't work can it? Your build setup's going to be like the creature in the episode where they've taken one too many jumps and it's gone into the foetal position.

So this is the plan kids. Let's take a simple TypeScript, webpack setup and make it a PWA. Like Victoria Wood said...

Let's Do It Tonight#

How to begin? Well first comes the plagiarism; here's a simple TypeScript webpack setup. Rob it. Stick a gun to its head and order it onto your hard drive. yarn install to pick up your dependencies and then yarn start to see what you've got. Something like this:

Beautiful right? And if we yarn build we end up with a simple output:

To test what we've built out we want to use a simple web server to serve up the dist folder. I've got the npm package http-server installed globally for just such an eventuality. So let's http-server ./dist and I'm once again looking at our simple app; it looks exactly the same as when I yarn start. Smashing. What would we see if we were offline? Well thanks to the magic of Chrome DevTools we can find out. Go offline and refresh our browser...

Not very user friendly. Once we're done, we should be able to refresh and still see our app.

Work(box) It#

Workbox is a project that makes the setting up of Service Workers (aka the magic that powers PWAs) easier. It supports webpack use cases through the workbox-webpack-plugin; so let's give it a whirl. Incidentally, there's a cracking example on the Workbox site.

yarn add workbox-webpack-plugin --dev adds the plugin to our project. To make use of it, punt your way over to the webpack.production.config.js and add an entry for the plugin. We also need to set the hash parameter of the html-webpack-plugin to be false; if it's true it'll cause problems for the ServiceWorker.

const path = require('path');
const WorkboxPlugin = require('workbox-webpack-plugin');
//...
module.exports = {
//...
plugins: [
//...
new HtmlWebpackPlugin({
hash: false,
inject: true,
template: 'src/index.html',
minify: {
removeComments: true,
collapseWhitespace: true,
removeRedundantAttributes: true,
useShortDoctype: true,
removeEmptyAttributes: true,
removeStyleLinkTypeAttributes: true,
keepClosingSlash: true,
minifyJS: true,
minifyCSS: true,
minifyURLs: true,
},
}),
new WorkboxPlugin({
// we want our service worker to cache the dist directory
globDirectory: 'dist',
// these are the sorts of files we want to cache
globPatterns: ['**/*.{html,js,css,png,svg,jpg,gif,json}'],
// this is where we want our ServiceWorker to be created
swDest: path.resolve('dist', 'sw.js'),
// these options encourage the ServiceWorkers to get in there fast
// and not allow any straggling "old" SWs to hang around
clientsClaim: true,
skipWaiting: true,
}),
]
//...
};

With this in place, yarn build will generate a ServiceWorker. Now to alter our code to register it. Open up index.tsx and add this to the end of the file:

if ('serviceWorker' in navigator) {
window.addEventListener('load', () => {
navigator.serviceWorker.register('/sw.js').then(registration => {
// tslint:disable:no-console
console.log('SW registered: ', registration);
}).catch(registrationError => {
console.log('SW registration failed: ', registrationError);
});
});
}

Put it together and...

What Have We Got?#

Let's yarn build again.

Oooohh look! A service worker is with us. Does it work? Let's find out... http-server ./dist Browse to http://localhost:8080 and let's have a look at the console.

Looks very exciting. So now the test; let's go offline and refresh:

You are looking at the 200s of success. You're now running with webpack and TypeScript and you have built a Progressive Web Application. Feel good about life.

TypeScript Definitions, webpack and Module Types

A funny thing happened on the way to the registry the other day. Something changed in an npm package I was using and confusion arose. You can read my unfiltered confusion here but here's the slightly clearer explanation.

The TL;DR#

When modules are imported, your loader will decide which module format it wants to use. CommonJS / AMD etc. The loader decides. It's important that the export is of the same "shape" regardless of the module format. For 2 reasons:

  1. You want to be able to reliably use the module regardless of the choice that your loader has made for which export to use.
  2. Because when it comes to writing type definition files for modules, there is support for a single external definition. Not one for each module format.

The DR#

Once upon a time we decided to use big.js in our project. It's popular and my old friend Steve Ognibene apparently originally wrote the type definitions which can be found here. Then the definitions were updated by Miika Hänninen. And then there was pain.

UMD / CommonJS **and** Global exports oh my!#

My usage code was as simple as this:

import * as BigJs from 'big.js';
const lookABigJs = new BigJs(1);

If you execute it in a browser it works. It makes me a Big. However the TypeScript compiler is **not** happy. No siree. Nope. It's bellowing at me:

[ts] Cannot use 'new' with an expression whose type lacks a call or construct signature.

So I think: "Huh! I guess Miika just missed something off when he updated the definition files. No bother. I'll fix it." I take a look at how big.js exposes itself to the outside world. At the time, thusly:

//AMD.
if (typeof define === 'function' && define.amd) {
define(function () {
return Big;
});
// Node and other CommonJS-like environments that support module.exports.
} else if (typeof module !== 'undefined' && module.exports) {
module.exports = Big;
module.exports.Big = Big;
//Browser.
} else {
global.Big = Big;
}

Now, we were using webpack as our script bundler / loader. webpack is supersmart; it can take all kinds of module formats. So although it's more famous for supporting CommonJS, it can roll with AMD. That's exactly what's happening here. When webpack encounters the above code, it goes with the AMD export. So at runtime, import * as BigJs from 'big.js'; lands up resolving to the return Big; above.

Now this turns out to be super-relevant. I took a look at the relevant portion of the definition file and found this:

export const Big: BigConstructor;

Which tells me that Big is being exported as a subproperty of the module. That makes sense; that lines up with the module.exports.Big = Big; statement in the big.js source code. There's a "gotcha" coming; can you guess what it is?

The problem is that our type definition is not exposing Big as a default export. So even though it's there, TypeScript won't let us use it. What's killing us further is that webpack is loading the AMD export which doesn't have Big as a subproperty of the module. It only has it as a default.

Kitson Kelly expressed the problem well when he said:

there is a different shape depending on which loader is being used and I am not sure that makes a huge amount of sense. The AMD shape is different than the CommonJS shape. While that is technically possible, that feels like that is an issue.

One Definition to Rule Them All#

He's right; it is an issue. From a TypeScript perspective there is no way to write a definition file that allows for different module "shapes" depending upon the module type. If you really wanted to do that you're reduced to writing multiple definition files. That's a blind alley anyway; what you want is a module to expose itself with the same "shape" regardless of the module type. What you want is this:

AMD === CommonJS === Global
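In concrete terms - and this is a hedged sketch rather than the actual big.js source - a unified export looks something like a UMD wrapper where every branch hands out the very same object:

(function (global, factory) {
  if (typeof define === 'function' && define.amd) {
    define(factory); // AMD
  } else if (typeof module !== 'undefined' && module.exports) {
    module.exports = factory(); // CommonJS
  } else {
    global.Big = factory(); // Global
  }
})(this, function () {
  function Big(value) {
    /* ... */
  }
  Big.Big = Big; // the named property is the same in every module format
  return Big;
});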

And that's what we now have! Thanks to Michael McLaughlin, author of big.js, version 4.0 unified the export shape of the package. Miika Hänninen submitted another PR which fixed up the type definitions. And once again the world is a beautiful place!

TypeScript + Webpack: Super Pursuit Mode

This post also featured as a webpack Medium publication.

If you're like me then you'll like TypeScript and you'll like module bundling with webpack. You may also like speedy builds. That's completely understandable. The fact of the matter is, you sacrifice a bit of build speed to have webpack in the mix. Wouldn't it be great if we could even up the difference?

I'm the primary maintainer of ts-loader, a TypeScript loader for webpack. Just recently a couple of PRs were submitted that said, in other words: ts-loader is like this:

But it could be like this:

Apologies for the image quality above; there appear to be no high quality pictures out there of KITT in Super Pursuit Mode for me to defame with Garan Jenkin's atrocious puns.

fork-ts-checker-webpack-plugin#

"Faster type checking with forked process" read the enticing name of the issue. It turned out to be Piotr Oleś (@OlesDev) telling the world about his beautiful creation. He'd put together a mighty fine plugin that can be used alongside ts-loader called the fork-ts-checker-webpack-plugin. The name is a bit of a mouthful but the purpose is mouth-watering. To quote the README, it is a:

Webpack plugin that runs typescript type checker on a separate process.

What does this mean and how does this fit with ts-loader? Well, ts-loader does 2 jobs:

  1. It transpiles your TypeScript into JavaScript and hands it off to webpack
  2. It collects any TypeScript compilation errors and reports them to webpack

What this plugin does is say, "forget about #2 - we've got this." It removes the responsibility for type checking from ts-loader, so the only work ts-loader does is transpilation. In the meantime, the all-important type checking is still happening. To be honest, there would be little reason to recommend this approach otherwise. The difference is that fork-ts-checker-webpack-plugin is doing the heavy lifting in a separate process. This provides a nice performance boost to your workflow. ts-loader is doing less and that's a good thing.

The approach used here is similar to that employed by awesome-typescript-loader. ATL is another TypeScript loader for webpack by the excellent Stanislav Panferov. ATL also has a technique for performing typechecking in a forked process. fork-ts-checker-webpack-plugin was an effort by Piotr to implement something similar but with improved incremental build performance.

How do we use it? Add fork-ts-checker-webpack-plugin as a devDependency of your project and then amend the webpack.config.js to set ts-loader into transpileOnly mode and drop the plugin into the mix:

var ForkTsCheckerWebpackPlugin = require('fork-ts-checker-webpack-plugin');
var webpackConfig = {
// other config...
context: __dirname, // to automatically find tsconfig.json
module: {
rules: [
{
test: /\.tsx?$/,
loader: 'ts-loader',
options: {
// disable type checker - we will use it in fork plugin
transpileOnly: true
}
}
]
},
plugins: [
new ForkTsCheckerWebpackPlugin()
]
};

If you'd like to see an example of how to use the plugin then take a look at a simple example and a more involved one.

HappyPack#

Not so long ago I didn't know what ~~happyness~~ HappyPack was. "Happiness in the form of faster webpack build times." That's what it is.

HappyPack makes webpack builds faster by allowing you to transform multiple files in parallel.

It does this by spinning up multiple threads, each with their own loaders inside. We wanted to do this with ts-loader; to have multiple instances of ts-loader running. Work can then be divided up across these separate loaders. Isn't multi-threading great?

ts-loader did not initially play nicely with HappyPack; essentially this is because ts-loader touches parts of webpack's API that HappyPack replaces. The entirely wonderful Artem Kozlov submitted a PR which added HappyPack support to ts-loader. Support essentially amounts to switching ts-loader to run in transpileOnly mode and ensuring that there is no attempt to talk to parts of the webpack API that HappyPack removes.

It would be hard to recommend using HappyPack as is because, as with transpileOnly mode, you lose all typechecking. Where it becomes worthwhile is when it is combined with the fork-ts-checker-webpack-plugin so you keep the typechecking.

Enough with the chitter chatter; how can we achieve this? Add HappyPack as a devDependency of your project and then amend the webpack.config.js as follows:

var HappyPack = require('happypack');
var ForkTsCheckerWebpackPlugin = require('fork-ts-checker-webpack-plugin');
module.exports = {
// other config...
context: __dirname, // to automatically find tsconfig.json
module: {
rules: [
{
test: /\.tsx?$/,
exclude: /node_modules/,
loader: 'happypack/loader?id=ts'
}
]
},
plugins: [
new HappyPack({
id: 'ts',
threads: 2,
loaders: [
{
path: 'ts-loader',
query: { happyPackMode: true }
}
]
}),
new ForkTsCheckerWebpackPlugin({ checkSyntacticErrors: true })
]
};

Note that the ts-loader options are now configured via the HappyPack query and that we're running ts-loader with the happyPackMode option set.

There's one other thing to note which is important; we're now passing the checkSyntacticErrors option to the fork plugin. This ensures that the plugin checks for both syntactic errors (eg const array = [{} {}];) and semantic errors (eg const x: number = '1';). By default the plugin only checks for semantic errors. This is because when ts-loader is used with transpileOnly set, ts-loader will still report syntactic errors. But when used in happyPackMode it does not.

If you'd like to see an example of how to use HappyPack then once again we have a simple example and a more involved one.

thread-loader + cache-loader#

You might have some reservations about using HappyPack. First of all the quirky configuration required makes your webpack config rather less comprehensible. Also, HappyPack is not officially blessed by webpack. It is a side project developed externally from webpack and there's no guarantees that new versions of webpack won't break it. Neither of these are reasons not to use HappyPack but they are things to bear in mind.

What if there were a way to parallelise our builds which dealt with these issues? Well, there is! By using thread-loader and cache-loader in combination you can both feel happy that you're using an official webpack workflow and you can have a config that's less confusing.

What would that config look like? This:

var ForkTsCheckerWebpackPlugin = require('fork-ts-checker-webpack-plugin');
module.exports = {
// other config...
context: __dirname, // to automatically find tsconfig.json
module: {
rules: [
{
test: /\.tsx?$/,
use: [
{ loader: 'cache-loader' },
{
loader: 'thread-loader',
options: {
// there should be 1 cpu for the fork-ts-checker-webpack-plugin
workers: require('os').cpus().length - 1,
},
},
{
loader: 'ts-loader',
options: {
happyPackMode: true // IMPORTANT! use happyPackMode mode to speed-up compilation and reduce errors reported to webpack
}
}
]
}
]
},
plugins: [
new ForkTsCheckerWebpackPlugin({ checkSyntacticErrors: true })
]
};

As you can see the configuration is much cleaner than with HappyPack. Interestingly, ts-loader still needs to run in "happyPackMode"; that's because thread-loader behaves in essentially the same fashion as HappyPack, so ts-loader needs to behave in the same way. Probably ts-loader should have a more generic flag name than "happyPackMode". (Famously, naming things is hard; so if you've a good idea, tell me!)

These loaders are new and so tread carefully. My own experiences have been pretty positive but your mileage may vary. Do note that, as with HappyPack, the thread-loader is highly configurable.

If you'd like to see an example of how to use thread-loader and cache-loader then once again we have a simple example and a more involved one.

All This Could Be Yours...#

Wow! It looks like we can cut our build time by 4 minutes! #webpack @typescriptlang // cc @johnny_reilly pic.twitter.com/gjvy9SLBAT

— Donald Pipowitch (@PipoPeperoni) June 23, 2017

In this post we're improving build speeds with TypeScript and webpack in 3 ways:

fork-ts-checker-webpack-plugin
With this plugin in play ts-loader only performs transpilation. ts-loader is doing less so the build is faster.
HappyPack
With HappyPack in the mix, the build is parallelised. That parallelisation means the build is faster.
thread-loader / cache-loader
With thread-loader and cache-loader, again the build is parallelised and the build is faster.

Dynamic import: I've been awaiting you...

One of the most exciting features to ship with TypeScript 2.4 was support for the dynamic import expression. To quote the release blog post:

Dynamic import expressions are a new feature in ECMAScript that allows you to asynchronously request a module at any arbitrary point in your program. These modules come back as Promises of the module itself, and can be await-ed in an async function, or can be given a callback with .then.

...

Many bundlers have support for automatically splitting output bundles (a.k.a. “code splitting”) based on these import() expressions, so consider using this new feature with the esnext module target. Note that this feature won’t work with the es2015 module target, since the feature is anticipated for ES2018 or later.

As the post makes clear, this adds support for a very bleeding edge ECMAScript feature. This is not fully standardised yet; it's currently at stage 3 on the TC39 proposals list. That means it's at the Candidate stage and is unlikely to change further. If you'd like to read more about it then take a look at the official proposal here.

Whilst this is super-new, we are still able to use this feature. We just have to jump through a few hoops first.

TypeScript Setup#

First of all, you need to install TypeScript 2.4. With that in place you need to make some adjustments to your tsconfig.json so that the relevant compiler switches are flipped. What do you need? First, you need to be targeting ECMAScript 2015 as a minimum. That's important specifically because ES2015 contained Promises, which is what dynamic imports produce. The second thing you need is to target the module type of esnext. You're likely targeting es2015 now; esnext is that plus dynamic imports.

Here's a tsconfig.json I made earlier which has the relevant settings set:

{
"compilerOptions": {
"allowSyntheticDefaultImports": true,
"lib": [
"dom",
"es2015"
],
"target": "es2015",
"module": "esnext",
"moduleResolution": "node",
"noImplicitAny": true,
"noUnusedLocals": true,
"noUnusedParameters": true,
"removeComments": false,
"preserveConstEnums": true,
"sourceMap": true,
"skipLibCheck": true
}
}

Babel Setup#

At the time of writing, browser support for dynamic import is non-existent. This will likely be the case for some time but it needn't hold us back. Babel can step in here and compile our super-new JS into JS that will run in our browsers today.

You'll need to decide for yourself how much you want Babel to do for you. In my case I'm targeting old school browsers which don't yet support ES2015. You may not need to. However, the one thing that you'll certainly need is the Syntax Dynamic Import plugin. It's this that allows Babel to process dynamic import statements.

These are the options I'm passing to Babel:

var babelOptions = {
"plugins": ["syntax-dynamic-import"],
"presets": [
[
"es2015",
{
"modules": false
}
]
]
};

You're also going to need something that actually executes the imports. In my case I'm using webpack...

webpack#

webpack 2 supports import(). So if you have webpack set up with ts-loader (or awesome-typescript-loader etc), chaining into babel-loader, you should find you have a setup that supports dynamic import. That means a webpack.config.js that looks something like this:

var path = require('path');
var webpack = require('webpack');
var babelOptions = {
"plugins": ["syntax-dynamic-import"],
"presets": [
[
"es2015",
{
"modules": false
}
]
]
};
module.exports = {
entry: './app.ts',
output: {
filename: 'bundle.js'
},
module: {
rules: [{
test: /\.ts(x?)$/,
exclude: /node_modules/,
use: [
{
loader: 'babel-loader',
options: babelOptions
},
{
loader: 'ts-loader'
}
]
}, {
test: /\.js$/,
exclude: /node_modules/,
use: [
{
loader: 'babel-loader',
options: babelOptions
}
]
}]
},
resolve: {
extensions: ['.ts', '.tsx', '.js']
},
};

ts-loader example#

I'm one of the maintainers of ts-loader which is a TypeScript loader for webpack. When support for dynamic imports landed I wanted to add a test to cover usage of the new syntax with ts-loader.

We have 2 test packs for ts-loader, one of which is our "execution" test pack. It is so named because it works by spinning up webpack with ts-loader and then using karma to execute a set of tests. Each "test" in our execution test pack is actually a mini-project with its own test suite (generally jasmine but that's entirely configurable). Each comes complete with its own webpack.config.js, karma.conf.js and either a typings.json or package.json for bringing in dependencies. So it's a full test of whether code slung with ts-loader and webpack actually executes when the output is plugged into a browser.

This is the test pack for dynamic imports:

import a from "../src/a";
import b from "../src/b";
describe("app", () => {
it("a to be 'a' and b to be 'b' (classic)", () => {
expect(a).toBe("a");
expect(b).toBe("b");
});
it("import results in a module with a default export", done => {
import("../src/c").then(c => {
// .default is the default export
expect(c.default).toBe("c");
done();
});
});
it("import results in a module with an export", done => {
import("../src/d").then(d => {
// .default is the default export
expect(d.d).toBe("d");
done();
});
});
it("await import results in a module with a default export", async done => {
const c = await import("../src/c");
// .default is the default export
expect(c.default).toBe("c");
done();
});
it("await import results in a module with an export", async done => {
const d = await import("../src/d");
expect(d.d).toBe("d");
done();
});
});

As you can see, it's possible to use the dynamic import as a Promise directly. Alternatively, it's possible to consume the imported module using TypeScript's support for async / await. For my money the latter option makes for much clearer code.

If you're looking for a complete example of how to use the new syntax then you could do worse than taking the existing test pack and tweaking it to your own ends. The only change you'd need to make is to strip out the resolveLoader statements in webpack.config.js and karma.conf.js. (They exist to lock the test case to the freshly built ts-loader stored locally. You'll not need this.)

You can find the test in question here. Happy code splitting!

TypeScript: Spare the Rod, Spoil the Code

I've recently started a new role. Perhaps unsurprisingly, part of the technology stack is TypeScript. A couple of days into the new codebase I found a bug. Well, I say I found a bug, TypeScript and VS Code found the bug - I just let everyone else know.

The flexibility that TypeScript offers in terms of compiler settings is second to none. You can turn up the dial of strictness to your heart's content. Or down. I'm an "up" man myself.

The project that I am working on has the dial set fairly low; it's pretty much using the default compiler values which are (sensibly) not too strict. I have to say this makes sense for helping people get on board with using TypeScript. Start from a point of low strictness and turn it up when you're ready. As you might have guessed, I cranked the dial up on day one on my own machine. I should say that as I did this, I didn't foist this on the project at large - I kept it just to my build... I'm not *that* guy!

I made the below changes to the tsconfig.json file. Details of what each of these settings does can be found in the documentation here.

"noImplicitAny": true,
"noImplicitThis": true,
"noUnusedLocals": true,
"noImplicitReturns": true,
"noUnusedParameters": true,

I said I found a bug. The nature of the bug was an unused variable; a variable was created in a function but then not used. Here's a super simple example:

function sayHi(name: string) {
const greeting = `Hi ${ name }`;
return name;
}

It's an easy mistake to make. I've made this mistake before myself. But with the noUnusedLocals compiler setting in place it's now an easy mistake to catch; VS Code lets you know loud and clear:

The other compiler settings will similarly highlight simple mistakes it's possible to make and I'd recommend using them. I should say I've written this from the perspective of a VS Code user, but this really applies generally to TypeScript usage. So whether you're an alm.tools guy, a WebStorm gal or something else entirely then this too can be yours!
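To give a flavour, here's a contrived sketch of the kind of slip that noImplicitReturns catches:

function describeWeather(isGood: boolean) {
  if (isGood) {
    return 'shorts weather';
  }
  // error: Not all code paths return a value.
}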

I'd also say that the strictNullChecks compiler setting is worth looking into. However, switching an already established project to using that can involve fairly extensive code changes and will also require a certain amount of education of, and buy in from, your team. So whilst I'd recommend it too, I'd save that one until last.
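For a taste of what's on offer, here's a hedged sketch of the kind of bug strictNullChecks flags up:

function shoutName(name: string | undefined) {
  // with strictNullChecks on, the compiler objects: Object is possibly 'undefined'.
  return name.toUpperCase();
}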

But you can't die... I love you!

That's how I was feeling on the morning of October 6th 2016. I'd been feeling that way for some time. The target of my concern? ts-loader. ts-loader is a loader for webpack; the module bundler. ts-loader allows you use TypeScript with webpack. I'd been a merry user of it for at least a year or so. But, at that point, all was not well in the land of ts-loader. Come with me and I'll tell you a story...

Going Red#

At some point, I became a member of the TypeStrong organisation on GitHub. I'm honestly not entirely sure how. I think it may have been down to the very excellent Basarat (he of ALM / atom-typescript / the list goes on fame) but I couldn't clearly say.

Either way, James Brantly's ts-loader was also one of TypeStrong's projects. Since I used it, I occasionally contributed. Not much to be honest; mostly it was documentation tweaks. I mean I never really looked at the main code at all. It worked (thanks to other people). I just plugged it into my projects and ploughed on my merry way. I liked it. It was well established; with friendly maintainers. It had a continuous integration test pack that ran against multiple versions of TypeScript on both Windows and Linux. I trusted it. Then one day the continuous integration tests went red. And stayed red.

This is where we came in. On the morning of October 6th I was mulling what to do about this. I knew there was another alternative out there (awesome-typescript-loader) but I was a little wary of it. My understanding of ATL was that it targeted webpack 2.0 which has long been in beta. Where I ply my trade (mostly developing software for the financial sector in the City of London) beta is not a word that people trust. They don't do beta. What's more I was quite happy with ts-loader; I didn't want to switch if I didn't have to. I also rather suspected (rightly) that there wasn't much wrong; ts-loader just needed a little bit of love. So I thought: I bet I can help here.

The Statement of Intent#

So that evening I raised an issue against ts-loader. Not a "sort it out chap" issue. No. That wouldn't be terribly helpful. I raised a "here's how I can help" issue. I present an abridged version below:

Okay here's the deal; I've been using ts-loader for a long time but my contributions up until now have mostly been documentation. Fixing of tests etc. As the commit history shows this is @jbrantly's baby and kudos to him.

He's not been able to contribute much of late and since he's the main person who's worked on ts-loader not much has happened for a while; the code is a bit stale. As I'm a member of TypeStrong I'm going to have a go at improving the state of the project. I'm going to do this as carefully as I can. This issue is intended as a meta issue to make it visible what I'm planning to do / doing.

My immediate goal is to get a newer version of ts-loader built and shipped. Essentially all the bug fixes / tweaks since the last release should ship.

...

I don't have npm publish rights for ts-loader. Fortunately both @jbrantly and @blakeembrey do - and hopefully one of them will either be able to help out with a publish or let me have the requisite rights to do it.

I can't promise this is all going to work; I've got a limited amount of spare time I'm afraid. Whatever happens it's going to take me a little while. But I'm going to see where I can take this. Best foot forward! Please bear with me...

I did wonder what would happen next. This happened next:

My #opensourceguilt has been lifted thanks to @johnny_reilly stepping up to take over ts-loader. Thanks man!

— James Brantly (@jbrantly) October 11, 2016

Caretaker, not BDFL#

So that's how it came to pass that I became the present main caretaker of ts-loader. James very kindly gave me the rights to publish to npm and soon enough I did. I fixed up the existing integration test pack; made it less brittle. I wrote a new integration test pack (that performs a different sort of testing; execution rather than comparison). I merged pull requests, I closed issues. I introduced a regression (whoops!), a community member helped me fix it (thanks Mike Mazmanyan!). In the last month ts-loader has shipped 6 times.

The things that matter most in the last paragraph are the phrases "I merged pull requests" and "a community member helped me fix it". I'm wary of one man bands; you should be too. I want projects to be a thing communally built and maintained. If I go under a bus I want someone else to be able to carry on without me. So be part of this; I want you to help!

I've got plans to do a lot more. I'm in the process of refactoring ts-loader to make it more modular and hence easier for others to contribute. (Also it must be said, refactoring something is an excellent way to try and learn a codebase.) Version 1.0 of ts-loader should ship this week.

I'm working with Herrington Darkholme (awesome name BTW!) to add a hook-in point that will allow ts-loader to support vuejs. Stuff is happening and will continue to. But don't be shy; be part of this! ts-loader awaits your PRs and is happy to have as many caretakers as possible!

TypeScript 2.0, ES2016 and Babel

TypeScript 2.0 has shipped! Naturally I'm excited. For some time I've been using TypeScript to emit ES2015 code which I pass onto Babel to transpile to ES "old school". You can see how here.

Merely upgrading my package.json to use "typescript": "^2.0.3" from "typescript": "^1.8.10" was painless. TypeScript now supports ES2016 (the previous major release 1.8 supported ES2015). I wanted to move on from writing ES2015 to writing ES2016 using my chosen build process. Fortunately, it's supported. Phew. However, due to some advances in ecmascript feature modularisation within the TypeScript compiler the upgrade path is slightly different. I figured that I'd just be able to update the [target](https://www.typescriptlang.org/docs/handbook/compiler-options.html) in my tsconfig.json to "es2016" from "es2015", add in the ES2016 preset for Babel and job's a good 'un. Not so. There were a few more steps to follow. Here's the recipe:

tsconfig.json changes#

Well, there's no "es2016" target for TypeScript. You carry on with a target of "es2015". What I need is a new entry: "lib": ["dom", "es2015", "es2016"]. This tells the compiler that we're expecting to be emitting to an environment which supports a browser ("dom"), and both ES2016 and ES2015. Our "environment" is Babel and it's going to pick up the baton from this point. My complete tsconfig.json looks like this:

{
"compileOnSave": false,
"compilerOptions": {
"allowSyntheticDefaultImports": true,
"lib": ["dom", "es2015", "es2016"],
"jsx": "preserve",
"module": "es2015",
"moduleResolution": "node",
"noEmitOnError": false,
"noImplicitAny": true,
"preserveConstEnums": true,
"removeComments": false,
"suppressImplicitAnyIndexErrors": true,
"target": "es2015"
}
}

Babel changes#

I needed the Babel preset for ES2016; with a quick [npm install --save-dev babel-preset-es2016](https://www.npmjs.com/package/babel-preset-es2016) that was sorted. Now just to kick Webpack into gear...

Webpack changes#

My webpack config plugs together TypeScript and Babel with the help of ts-loader and babel-loader. It allows the transpilation of my (few) JavaScript files so I can write ES2015. However, mainly it allows the transpilation of my (many) TypeScript files so I can write ES2015-flavoured TypeScript. I'll now tweak the loaders so they cater for ES2016 as well.

var webpack = require('webpack');
module.exports = {
// ....
module: {
loaders: [{
// Now transpiling ES2016 TS
test: /\.ts(x?)$/,
exclude: /node_modules/,
loader: 'babel-loader?presets[]=es2016&presets[]=es2015&presets[]=react!ts-loader'
}, {
// Now transpiling ES2016 JS
test: /\.js$/,
exclude: /node_modules/,
loader: 'babel',
query: {
presets: ['es2016', 'es2015', 'react']
}
}]
},
// ....
};

Wake Up and Smell the Jasmine#

And we're there; it works. How do I know? Well; here's the proof:

it("Array.prototype.includes works", () => {
const result = [1, 2, 3].includes(2);
expect(result).toBe(true);
});
it("Exponentiation operator works", () => {
expect(1 ** 2 === Math.pow(1, 2)).toBe(true);
});

Much love to the TypeScript team for an awesome job; I can't wait to get stuck into some of the exciting new features of TypeScript 2.0. strictNullChecks FTW!

The Ternary Operator <3 Destructuring

I'm addicted to the ternary operator. For reasons I can't explain, I cannot get enough of:

const thisOrThat = (someCondition) ? "this" : "or that"

The occasion regularly arises where I need to turn my lovely terse code into an if statement in order to set 2 variables instead of 1. I've been heartbroken; I hate doing:

let legWear: string, coat: boolean;
if (weather === "good") {
legWear = "shorts";
coat = false;
}
else {
legWear = "jeans";
coat = true;
}

Just going from setting one variable to setting two has been really traumatic:

  • I've had to stop using const and move to let. This has made my code less "truthful" in the sense that I never intend to reassign these variables again; they are intended to be immutable.
  • I've gone from 1 line of code to 9 lines of code. That's 9x the code for increasing the number of variables in play by 1. That's... heavy.
  • This third point only applies if you're using TypeScript (and I am): I have to specify the types of my variables up front if I want type safety.

ES2015 gives us another option. We can move back to the ternary operator if we change the return type of each branch to be an object sharing the same signature. Then, using destructuring, we can pull out those object properties into consts:

const { legWear, coat } = (weather === "good")
? { legWear: "shorts", coat: false }
: { legWear: "jeans", coat: true }

With this approach we're keeping usage of const instead of let and we're only marginally increasing the amount of code we're writing. If you're using TypeScript you're back to being able to rely on the compiler correctly inferring your types; you don't need to specify. Awesome.

Crowdfund You A Tuple#

I thought I was done and then I saw this:

@johnny_reilly even neater with tuples: const [str, num] = test ? ["yes", 100] : ["no", 50];

— Illustrated Pamphlet (@Rickenhacker) August 20, 2016

Daniel helpfully points out that there's an even terser syntax available to us:

const [ legWear, coat ] = (weather === "good")
? [ "shorts", false ]
: [ "jeans", true ]

The above is ES2015 array destructuring. We get exactly the same effect but it's a little terser as we don't have to repeat the prop names as we do when using object destructuring. From a TypeScript perspective the assignment side of the above is a Tuple which allows our type inference to flow through in the manner we'd hope.

Lovely. Thanks!

Understanding Webpack's DefinePlugin (and using with TypeScript)

I've been searching for a way to describe what the DefinePlugin actually does. The docs say*:

Define free variables. Useful for having development builds with debug logging or adding global constants.

* Actually that should read "used to say". I've made some changes to the official docs.... (Surprisingly easy to do that by the way; it's just a wiki you can edit at will.)

I think I would describe it thusly: the DefinePlugin allows you to create global constants which can be configured at compile time. I find this very useful for allowing different behaviour between development builds and release builds. This post will demonstrate usage of this approach, talk about what's actually happening and how to get this working nicely with TypeScript.

If you just want to see this in action then take a look at this repo and keep your eyes open for usage of __VERSION__ and __IN_DEBUG__.

What Globals?#

For our example we want to define 2 global constants; a string called __VERSION__ and a boolean called __IN_DEBUG__. The names are deliberately wacky to draw attention to the fact that these are not your everyday, common-or-garden variables. Them's "special". These constants will be initialised with different values depending on whether we are in a debug build or a production build. Usage of these constants in our code might look like this:

if (__IN_DEBUG__) {
console.log(`This app is version ${ __VERSION__ }`);
}

So, if __IN_DEBUG__ is set to true this code would log out to the console the version of the app.

Configuring our Globals#

To introduce these constants to webpack we're going to add this to our webpack configuration:

var webpack = require('webpack');
// ...
plugins: [
new webpack.DefinePlugin({
__IN_DEBUG__: JSON.stringify(false),
__VERSION__: JSON.stringify('1.0.0.' + Date.now())
}),
// ...
]
// ...

What's going on here? Well, each key of the object literal above represents one of our global constants. When you look at the value, just imagine each outer JSON.stringify( ... ) is not there. It's just noise. Imagine instead that you're seeing this:

__IN_DEBUG__: false,
__VERSION__: '1.0.0.' + Date.now()

A little clearer, right? __IN_DEBUG__ is given the boolean value false and __VERSION__ is given the string value of 1.0.0. plus the ticks off of Date.now(). What's happening here is well explained in Pete Hunt's excellent webpack howto: "definePlugin takes raw strings and inserts them". JSON.stringify facilitates this; it produces a string representation of a value that can be inlined into code. When the inlining takes place the actual output would be something like this:

if (false) { // Because at compile time, __IN_DEBUG__ === false
console.log(`This app is version ${ "1.0.0.1469268116580" }`); // And __VERSION__ === "1.0.0.1469268116580"
}

And if you've got some UglifyJS or similar in the mix then, in the example above, this would actually strip out the statement entirely since it's clearly a NOOP. Yay the dead code removal! If __IN_DEBUG__ was true then (perhaps obviously) this statement would be left in place as it wouldn't be dead code.

TypeScript and Define#

The final piece of the puzzle is making TypeScript happy. It doesn't know anything about our global constants. So we need to tell it:

declare var __IN_DEBUG__: boolean;
declare var __VERSION__: string;

And that's it. Compile time constants are a go!

Creating an ES2015 Map from an Array in TypeScript

I'm a great lover of ES2015's [Map](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Map). However, just recently I tumbled over something I find a touch inconvenient about how you initialise a new Map from the contents of an Array in TypeScript.

This Doesn't Work#

We're going to try something like this (pilfered from the MDN docs):

var kvArray = [["key1", "value1"], ["key2", "value2"]];
// Use the regular Map constructor to transform a 2D key-value Array into a map
var myMap = new Map(kvArray);

Simple enough right? Well I'd rather assumed that I should be able to do something like this in TypeScript:

const iAmAnArray = [
{ value: "value1", text: "hello" },
{ value: "value2", text: "map" }
];
const iAmAMap = new Map<string, string>(
iAmAnArray.map(x => [x.value, x.text])
);

However, to my surprise this errored out with:

[ts] Argument of type 'string[][]' is not assignable to parameter of type 'Iterable<[string, string]>'.
Types of property '[Symbol.iterator]' are incompatible.
Type '() => IterableIterator<string[]>' is not assignable to type '() => Iterator<[string, string]>'.
Type 'IterableIterator<string[]>' is not assignable to type 'Iterator<[string, string]>'.
Types of property 'next' are incompatible.
Type '(value?: any) => IteratorResult<string[]>' is not assignable to type '(value?: any) => IteratorResult<[string, string]>'.
Type 'IteratorResult<string[]>' is not assignable to type 'IteratorResult<[string, string]>'.
Type 'string[]' is not assignable to type '[string, string]'.
Property '0' is missing in type 'string[]'.

Disappointing right? It's expecting Iterable<[string, string]> and an Array with 2 elements that are strings is not inferred to be that.

This Does#

It emerges that there is a way to do this though; you just need to give the compiler a clue. You need to include a type assertion of as [string, string] which tells the compiler that what you've just declared is a Tuple of string and string. (Please note that [string, string] corresponds to the types of the Key and Value of your Map and should be set accordingly.)

So a working version of the code looks like this:

const iAmAnArray = [
{ value: "value1", text: "hello" },
{ value: "value2", text: "map" }
];
const iAmAMap = new Map<string, string>(
iAmAnArray.map(x => [x.value, x.text] as [string, string])
);

Or, to be terser, this:

const iAmAnArray = [
{ value: "value1", text: "hello" },
{ value: "value2", text: "map" }
];
const iAmAMap = new Map( // Look Ma! No type annotations
iAmAnArray.map(x => [x.value, x.text] as [string, string])
);

I've raised this as an issue with the TypeScript team; you can find details here.

Visual Studio, tsconfig.json and external TypeScript compilation

TypeScript first gained support for tsconfig.json back with the 1.5 release. However, to my lasting regret and surprise Visual Studio will not be gaining meaningful support for it until TypeScript 1.8 ships. However, if you want it now, it's already available to use in beta.

I've already leapt aboard. Whilst there's a number of bugs in the beta it's still totally usable. So use it.

External TypeScript Compilation and the VS build#

Whilst tsconfig.json is useful and super cool it has limitations. It allows you to deactivate compilation upon file saving using compileOnSave. What it doesn't allow is deactivation of the TypeScript compilation that happens as part of a Visual Studio build. That may not matter for the vanilla workflow of just dropping TypeScript files in a Visual Studio web project and having VS invoke the TypeScript compilation. However it comes to matter when your workflow deviates from the norm, as mine does. Using external compilation of TypeScript within Visual Studio is a little tricky. My own use case is somewhat atypical but perhaps not uncommon.
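(For reference, compileOnSave is just a top-level property in tsconfig.json; something along these lines:)

{
  "compileOnSave": false,
  "compilerOptions": {
    // your usual settings live here
  }
}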

I'm working on a project which has been built using TypeScript since TS 0.9. Not surprisingly, this started off using the default Visual Studio / TypeScript workflow. Active development on the project ceased around 9 months ago. Now it's starting up again. It's a reasonable sized web app and the existing functionality is, in the main, fine. But the users want to add some new screens.

Like any developer, I want to build with the latest and greatest. In my case, this means I want to write modular ES6 using TypeScript. With this approach my code can be leaner and I have less script ordering drama in my life. (Yay import statements!) This can be done by bringing together webpack, TypeScript (ts-loader) and Babel (babel-loader). There's an example of this approach here. Given the size of the existing codebase I'd rather leave the legacy TypeScript as is and isolate my new approach to the screens I'm going to build. Obviously I'd like to have a common build process for all the codebase at some point but I've got a deadline to meet and so a half-old / half-new approach is called for (at least for the time being).

Goodbye TypeScript Compilation in VS#

Writing modular ES6 TypeScript which is fully transpiled to old-school JS is not possible using the Visual Studio tooling at present. For what it's worth I think that SystemJS compilation may make this more possible in the future but I don't really know enough about it to be sure. That's why I'm bringing webpack / Babel into the mix right now. I don't want Visual Studio to do anything for the ES6 code; I don't want it to compile. I want to deactivate my TypeScript compilation for the ES6 code. I can't do this from the tsconfig.json so I'm in a bit of a hole. What to do?

Well, as of (I think) TypeScript 1.7 it's possible to deactivate TypeScript compilation in Visual Studio. To quote:

there is an easier way to disable TypeScriptCompile:

Just add <TypeScriptCompileBlocked>true</TypeScriptCompileBlocked> to the .csproj, e.g. in the first <PropertyGroup>.

Awesomeness!

But wait, this means that the legacy TypeScript isn't being compiled any longer. Bear in mind, I'm totally happy with the existing / legacy TypeScript compilation. Nooooooooooooooo!!!!!!!!!!!!!!!

Hello TypeScript Compilation outside VS#

Have no fear, I gotcha. What we're going to do is ensure that Visual Studio triggers 2 external TypeScript builds as part of its own build process:

  • The modular ES6 TypeScript (new)
  • The legacy TypeScript (old)

How do we do this? Through the magic of build targets. We need to add this to our .csproj: (I add it near the end; I'm not sure if location matters though)

<PropertyGroup>
<CompileDependsOn>
$(CompileDependsOn);
WebClientBuild;
</CompileDependsOn>
<CleanDependsOn>
$(CleanDependsOn);
WebClientClean
</CleanDependsOn>
<CopyAllFilesToSingleFolderForPackageDependsOn>
CollectGulpOutput;
CollectLegacyTypeScriptOutput;
$(CopyAllFilesToSingleFolderForPackageDependsOn);
</CopyAllFilesToSingleFolderForPackageDependsOn>
<CopyAllFilesToSingleFolderForMsdeployDependsOn>
CollectGulpOutput;
CollectLegacyTypeScriptOutput;
$(CopyAllFilesToSingleFolderForMsdeployDependsOn);
</CopyAllFilesToSingleFolderForMsdeployDependsOn>
</PropertyGroup>
<Target Name="WebClientBuild">
<Exec Command="npm install" />
<Exec Command="npm run build-legacy-typescript" />
<Exec Command="npm run build -- --mode $(ConfigurationName)" />
</Target>
<Target Name="WebClientClean">
<Exec Command="npm run clean" />
</Target>
<Target Name="CollectGulpOutput">
<ItemGroup>
<_CustomFiles Include="dist\**\*" />
<FilesForPackagingFromProject Include="%(_CustomFiles.Identity)">
<DestinationRelativePath>dist\%(RecursiveDir)%(Filename)%(Extension)</DestinationRelativePath>
</FilesForPackagingFromProject>
</ItemGroup>
<Message Text="CollectGulpOutput list: %(_CustomFiles.Identity)" />
</Target>
<Target Name="CollectLegacyTypeScriptOutput">
<ItemGroup>
<_CustomFiles Include="Scripts\**\*.js" />
<FilesForPackagingFromProject Include="%(_CustomFiles.Identity)">
<DestinationRelativePath>Scripts\%(RecursiveDir)%(Filename)%(Extension)</DestinationRelativePath>
</FilesForPackagingFromProject>
</ItemGroup>
<Message Text="CollectLegacyTypeScriptOutput list: %(_CustomFiles.Identity)" />
</Target>

There's a few things going on here; let's take them one by one.

The WebClientBuild Target#

This target triggers our external builds. One by one it runs the following commands:

npm install
Installs the npm packages.
npm run build-legacy-typescript
Runs the "build-legacy-typescript"script in our package.json
npm run build -- --mode $(ConfigurationName)
Runs the "build"script in our package.json and passes through a mode parameter of either "Debug" or "Release" from MSBuild - indicating whether we're creating a debug or a release build.

As you've no doubt gathered, I'm following the convention of using the scripts element of my package.json as a repository for the various build tasks I might have for a web project. It looks like this:

{
// ...
"scripts": {
"test": "karma start --reporters mocha,junit --single-run --browsers PhantomJS",
"build-legacy-typescript": "tsc -v&&tsc --project Scripts",
"clean": "gulp delete-dist-contents",
"watch": "gulp watch",
"build": "gulp build"
},
// ...
}

As you can see, "build-legacy-typescript" invokes tsc (which is registered as a devDependency) to print out the version of the compiler. Then it invokes tsc again using the project flag directly on the Scripts directory. This is where the legacy TypeScript and its associated tsconfig.json resides. Et voilá, the old / existing TypeScript is compiled just as it was previously by VS itself.

Next, the "build" invokes a gulp task called, descriptively, "build". This task caters for our brand new codebase of modular ES6 TypeScript. When run, this task will invoke webpack, copy static files, build less etc. Quick digression: you can see there's a "watch" script that does the same thing on a file-watching basis; I use that during development.

The WebClientClean Target#

The task that runs to clean up artefacts created by WebClientBuild.

The CollectLegacyTypeScriptOutput and CollectGulpOutput Targets#

Since we're compiling our TypeScript outside of VS we need to tell MSBuild / MSDeploy about the externally compiled assets in order that they are included in the publish pipeline. Here I'm standing on the shoulders of Steve Cadwallader's excellent post. Thanks Steve!

CollectLegacyTypeScriptOutput and CollectGulpOutput respectively include all the built files contained in the "Scripts" and "dist" folders when a publish takes place. You don't need this for when you're building on your own machine but if you're looking to publish (either from your machine or from TFS) then you will need exactly this. Believe me that last sentence was typed with a memory of great pain and frustration.

So in the end, as far as TypeScript is concerned, I'm using Visual Studio solely as an editor. It's the hooks in the .csproj that ensure that compilation happens. It seems a little quirky that we still need to have the original TypeScript targets in the .csproj file as well; but it works. That's all that matters.

Definitely Typed Shouldn't Exist

OK - the title's total clickbait but stay with me; there's a point here.

I'm a member of the Definitely Typed team - and hopefully I won't be kicked out for writing this. My point is this: .d.ts files should live with the package they provide typing information for, in npm / GitHub etc. Not separately. TypeScript 1.6 has just been released. Yay! In the release blog post it says this:

We’ve changed module resolution when doing CommonJS output to work more closely to how Node does module resolution. If a module name is non-relative, we now follow these steps to find the associated typings:

  1. Check in node_modules for <module name>.d.ts
  2. Search node_modules\<module name>\package.json for a typings field
  3. Look for node_modules\<module name>\index.d.ts
  4. Then we go one level higher and repeat the process

Please note: when we search through node_modules, we assume these are the packaged node modules which have type information and a corresponding .js file. As such, we resolve only .d.ts files (not .ts file) for non-relative names.

Previously, we treated all module names as relative paths, and therefore we would never properly look in node_modules... We will continue to improve module resolution, including improvements to AMD, in upcoming releases.

The TL;DR is this: consuming npm packages which come with definition files should JUST WORK™... npm is now a first class citizen in TypeScriptLand. So everyone who has a package on npm should now feel duty bound to include a .d.ts when they publish and Definitely Typed can shut up shop. Simple right?

Wrong!#

Yeah, it's never going to happen. Surprising as it is, there are many people who are quite happy without TypeScript in their lives (I know - mad right?). These poor unfortunates are unlikely to ever take the extra steps necessary to write definition files. For this reason, there will probably always be a need for a provider of typings such as Definitely Typed. As well as that, the vast majority of people using TypeScript probably don't use npm to manage dependencies. There are, however, an increasing number of users who are using npm. Some (like me) may even be using tools like Browserify (with the TSIFY plugin) or WebPack (with the TS loader) to bring it all together. My feeling is that, over time, using npm will become more common; particularly given the improvements being made to module resolution in the language.

An advantage of shipping typings with an npm package is this: those typings should accurately describe their accompanying package. In Definitely Typed we only aim to support the latest and greatest typings. So if you find yourself looking for the typings of an older version of a package you're going to have to pick your way back through the history of a .d.ts file and hope you happen upon the version you're looking for. Not a fantastic experience.

So I guess what I'm saying is this: if you're an npm package author then it would be fantastic to start shipping a package with typings in the box. If you're using npm to consume packages then using Definitely Typed ought to be the second step you might take after installing a package; the step you only need to take if the package doesn't come with typings. Using DT should be a fallback, not a default.

Authoring npm modules with TypeScript#

Yup - that's what this post is actually about. See how I lured you in with my mild trolling and pulled the old switcheroo? That's edutainment my friend. So, how do we write npm packages in TypeScript and publish them with their typings? Apparently Gandhi didn't actually say "Be the change you wish to see in the world." Which is a shame. But anyway, I'm going to try and embrace the sentiment here.

Not so long ago I wrote a small npm module called globalize-so-what-cha-want. It is used to determine what parts of Globalize 1.x you need depending on the modules you're planning to use. It also contains a little demo UI / online tool, written in React, which powers this.

For this post, the purpose of the package is rather irrelevant. And even though I've just told you about it, I want you to pretend that the online tool doesn't exist. Pretend I never mentioned it.

What is relevant, and what I want you to think about, is this: I wrote globalize-so-what-cha-want in plain old, honest to goodness JavaScript. Old school.

But, my love of static typing could be held in abeyance for only so long. Once the initial package was written, unit tested and published I got the itch. THIS SHOULD BE WRITTEN IN TYPESCRIPT!!! Well, it didn't have to be but I wanted it to be. Despite having used TypeScript since the early days I'd only been using it for front end work; not for writing npm packages. My mission was clear: port globalize-so-what-cha-want to TypeScript and re-publish to npm.

Port, port, port!!!#

At this point globalize-so-what-cha-want consisted of a single index.js file in the root of the package. My end goal was to end up with that file still sat there, but now generated from TypeScript. Alongside it I wanted to see an index.d.ts which was generated from the same TypeScript.

index.js before looked like this:

/* jshint varstmt: false, esnext: false */
var DEPENDENCY_TYPES = {
SHARED_JSON: 'Shared JSON (used by all locales)',
LOCALE_JSON: 'Locale specific JSON (supplied for each locale)'
};
var moduleDependencies = {
'core': {
dependsUpon: [],
cldrGlobalizeFiles: ['cldr.js', 'cldr/event.js', 'cldr/supplemental.js', 'globalize.js'],
json: [
{ dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/likelySubtags.json' }
]
},
'currency': {
dependsUpon: ['number','plural'],
cldrGlobalizeFiles: ['globalize/currency.js'],
json: [
{ dependencyType: DEPENDENCY_TYPES.LOCALE_JSON, dependency: 'cldr/main/{locale}/currencies.json' },
{ dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/currencyData.json' }
]
},
'date': {
dependsUpon: ['number'],
cldrGlobalizeFiles: ['globalize/date.js'],
json: [
{ dependencyType: DEPENDENCY_TYPES.LOCALE_JSON, dependency: 'cldr/main/{locale}/ca-gregorian.json' },
{ dependencyType: DEPENDENCY_TYPES.LOCALE_JSON, dependency: 'cldr/main/{locale}/timeZoneNames.json' },
{ dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/timeData.json' },
{ dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/weekData.json' }
]
},
'message': {
dependsUpon: ['core'],
cldrGlobalizeFiles: ['globalize/message.js'],
json: []
},
'number': {
dependsUpon: ['core'],
cldrGlobalizeFiles: ['globalize/number.js'],
json: [
{ dependencyType: DEPENDENCY_TYPES.LOCALE_JSON, dependency: 'cldr/main/{locale}/numbers.json' },
{ dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/numberingSystems.json' }
]
},
'plural': {
dependsUpon: ['core'],
cldrGlobalizeFiles: ['globalize/plural.js'],
json: [
{ dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/plurals.json' },
{ dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/ordinals.json' }
]
},
'relativeTime': {
dependsUpon: ['number','plural'],
cldrGlobalizeFiles: ['globalize/relative-time.js'],
json: [
{ dependencyType: DEPENDENCY_TYPES.LOCALE_JSON, dependency: 'cldr/main/{locale}/dateFields.json' }
]
}
};
function determineRequiredCldrData(globalizeOptions) {
return determineRequired(globalizeOptions, _populateDependencyCurrier('json', function(json) { return json.dependency; }));
}
function determineRequiredCldrGlobalizeFiles(globalizeOptions) {
return determineRequired(globalizeOptions, _populateDependencyCurrier('cldrGlobalizeFiles', function(cldrGlobalizeFile) { return cldrGlobalizeFile; }));
}
function determineRequired(globalizeOptions, populateDependencies) {
var modules = Object.keys(globalizeOptions);
modules.forEach(function(module) {
if (!moduleDependencies[module]) {
throw new TypeError('There is no \'' + module + '\' module');
}
});
var requireds = [];
modules.forEach(function (module) {
if (globalizeOptions[module]) {
populateDependencies(module, requireds);
}
});
return requireds;
}
function _populateDependencyCurrier(requiredArray, requiredArrayGetter) {
var popDepFn = function(module, requireds) {
var dependencies = moduleDependencies[module];
dependencies.dependsUpon.forEach(function(requiredModule) {
popDepFn(requiredModule, requireds);
});
dependencies[requiredArray].forEach(function(required) {
var newRequired = requiredArrayGetter(required);
if (requireds.indexOf(newRequired) === -1) {
requireds.push(newRequired);
}
});
return requireds;
};
return popDepFn;
}
module.exports = {
determineRequiredCldrData: determineRequiredCldrData,
determineRequiredCldrGlobalizeFiles: determineRequiredCldrGlobalizeFiles
};

You can even kind of tell that it was written in JavaScript thanks to the jshint rules at the top.

I fired up Atom and created a new folder src/lib and inside there I created index.ts (yes, index.js renamed) and tsconfig.json. By the way, you'll notice I'm not leaving Atom - I'm making use of the magnificent atom-typescript which you should totally be using too. It rocks.

Now I'm not going to bore you with what I had to do to port the JS to TS (not much). If you're interested, the source is here. What's more interesting is the tsconfig.json - as it's this that is going to drive the generation of the JS and the .d.ts that we need:

{
  "compileOnSave": true,
  "compilerOptions": {
    "module": "commonjs",
    "declaration": true,
    "target": "es5",
    "noImplicitAny": true,
    "suppressImplicitAnyIndexErrors": true,
    "removeComments": false,
    "preserveConstEnums": true,
    "sourceMap": false,
    "outDir": "../../"
  },
  "files": [
    "index.ts"
  ]
}

The things to notice are:

module
Publishing a commonjs module means it will play well with npm
declaration
This is what makes TypeScript generate index.d.ts
outDir
We want to regenerate the index.js in the root (2 directories above this)

So now, what do we get when we build in Atom? Well, we're generating an [index.js](https://github.com/johnnyreilly/globalize-so-what-cha-want/blob/master/index.js) file which looks like this:

var DEPENDENCY_TYPES = {
SHARED_JSON: 'Shared JSON (used by all locales)',
LOCALE_JSON: 'Locale specific JSON (supplied for each locale)'
};
var moduleDependencies = {
'core': {
dependsUpon: [],
cldrGlobalizeFiles: ['cldr.js', 'cldr/event.js', 'cldr/supplemental.js', 'globalize.js'],
json: [
{ dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/likelySubtags.json' }
]
},
'currency': {
dependsUpon: ['number', 'plural'],
cldrGlobalizeFiles: ['globalize/currency.js'],
json: [
{ dependencyType: DEPENDENCY_TYPES.LOCALE_JSON, dependency: 'cldr/main/{locale}/currencies.json' },
{ dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/currencyData.json' }
]
},
'date': {
dependsUpon: ['number'],
cldrGlobalizeFiles: ['globalize/date.js'],
json: [
{ dependencyType: DEPENDENCY_TYPES.LOCALE_JSON, dependency: 'cldr/main/{locale}/ca-gregorian.json' },
{ dependencyType: DEPENDENCY_TYPES.LOCALE_JSON, dependency: 'cldr/main/{locale}/timeZoneNames.json' },
{ dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/timeData.json' },
{ dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/weekData.json' }
]
},
'message': {
dependsUpon: ['core'],
cldrGlobalizeFiles: ['globalize/message.js'],
json: []
},
'number': {
dependsUpon: ['core'],
cldrGlobalizeFiles: ['globalize/number.js'],
json: [
{ dependencyType: DEPENDENCY_TYPES.LOCALE_JSON, dependency: 'cldr/main/{locale}/numbers.json' },
{ dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/numberingSystems.json' }
]
},
'plural': {
dependsUpon: ['core'],
cldrGlobalizeFiles: ['globalize/plural.js'],
json: [
{ dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/plurals.json' },
{ dependencyType: DEPENDENCY_TYPES.SHARED_JSON, dependency: 'cldr/supplemental/ordinals.json' }
]
},
'relativeTime': {
dependsUpon: ['number', 'plural'],
cldrGlobalizeFiles: ['globalize/relative-time.js'],
json: [
{ dependencyType: DEPENDENCY_TYPES.LOCALE_JSON, dependency: 'cldr/main/{locale}/dateFields.json' }
]
}
};
function determineRequired(globalizeOptions, populateDependencies) {
var modules = Object.keys(globalizeOptions);
modules.forEach(function (module) {
if (!moduleDependencies[module]) {
throw new TypeError('There is no \'' + module + '\' module');
}
});
var requireds = [];
modules.forEach(function (module) {
if (globalizeOptions[module]) {
populateDependencies(module, requireds);
}
});
return requireds;
}
function _populateDependencyCurrier(requiredArray, requiredArrayGetter) {
var popDepFn = function (module, requireds) {
var dependencies = moduleDependencies[module];
dependencies.dependsUpon.forEach(function (requiredModule) {
popDepFn(requiredModule, requireds);
});
dependencies[requiredArray].forEach(function (required) {
var newRequired = requiredArrayGetter(required);
if (requireds.indexOf(newRequired) === -1) {
requireds.push(newRequired);
}
});
return requireds;
};
return popDepFn;
}
/**
* The string array returned will contain a list of the required cldr json data you need. I don't believe ordering matters for the json but it is listed in the same dependency order as the cldr / globalize files are.
*
* @param options The globalize modules being used.
*/
function determineRequiredCldrData(globalizeOptions) {
return determineRequired(globalizeOptions, _populateDependencyCurrier('json', function (json) { return json.dependency; }));
}
exports.determineRequiredCldrData = determineRequiredCldrData;
/**
* The string array returned will contain a list of the required cldr / globalize files you need, listed in the order they are required.
*
* @param options The globalize modules being used.
*/
function determineRequiredCldrGlobalizeFiles(globalizeOptions) {
return determineRequired(globalizeOptions, _populateDependencyCurrier('cldrGlobalizeFiles', function (cldrGlobalizeFile) { return cldrGlobalizeFile; }));
}
exports.determineRequiredCldrGlobalizeFiles = determineRequiredCldrGlobalizeFiles;

Aside from one method moving internally and me adding some JSDoc, the only really notable change is the end of the file. TypeScript, when generating commonjs, doesn't use the module.exports = {} approach. Rather, it assigns each exported function to the exports object as it is exported. Functionally this is identical.
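To illustrate with a tiny sketch of my own (not code from the package): a TypeScript module like this...

export function greet(name: string): string {
    return 'Hello ' + name;
}

...compiles, with module set to commonjs, to JavaScript that assigns onto the exports object - roughly exports.greet = greet; - rather than replacing module.exports wholesale. Consumers requiring the module see the same shape either way.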

Now for our big finish: happily sat alongside index.js is the [index.d.ts](https://github.com/johnnyreilly/globalize-so-what-cha-want/blob/master/index.d.ts) file:

export interface Options {
currency?: boolean;
date?: boolean;
message?: boolean;
number?: boolean;
plural?: boolean;
relativeTime?: boolean;
}
/**
* The string array returned will contain a list of the required cldr json data you need. I don't believe ordering matters for the json but it is listed in the same dependency order as the cldr / globalize files are.
*
* @param options The globalize modules being used.
*/
export declare function determineRequiredCldrData(globalizeOptions: Options): string[];
/**
* The string array returned will contain a list of the required cldr / globalize files you need, listed in the order they are required.
*
* @param options The globalize modules being used.
*/
export declare function determineRequiredCldrGlobalizeFiles(globalizeOptions: Options): string[];

We're there, huzzah! This has now been published to npm - anyone consuming this package can use TypeScript straight out of the box. I really hope that publishing npm packages in this fashion becomes much more commonplace. Time will tell.
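To round things off, here's a minimal consumption sketch of my own (not from the post) showing what a consumer gets now that the typings ship in the box:

// The import is fully typed because index.d.ts sits alongside index.js in the package.
import gswcw = require('globalize-so-what-cha-want');

const files: string[] = gswcw.determineRequiredCldrGlobalizeFiles({ date: true, number: true });
const json: string[] = gswcw.determineRequiredCldrData({ date: true, number: true });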

PS I'm not the only one#

I was just about to hit "publish" when I happened upon Basarat's ts-npm-module, a project on GitHub which demos how to publish and consume TypeScript using npm. I'd say great minds think alike but I'm pretty sure Basarat's mind is far greater than mine! (Cough, atom-typescript, cough.) Either way, it's good to see validation for the approach I'm suggesting.

PPS Update 23/09/2015 09:51#

One of the useful things about writing a blog is that you get to learn. Since I published I've become aware of a few things somewhat relevant to this post. First of all, there is still work ongoing in TypeScript land around this topic. Essentially there are problems resolving dependency conflicts when different dependencies have different versions - you can take part in the ongoing discussion here. There are also some useful resources to look at:

Hey tsconfig.json, where have you been all my life?

Sometimes, you just miss things. Something seismic happens and you had no idea. So it was with tsconfig.json.

This blog post started life with the name "TypeScript: Some IDEs are more equal than others". I'd intended to use it to summarise a discussion on the TypeScript GitHub repo about implicit referencing, including a fist shaken at the sky at the injustice of it all. But whilst I was writing it I discovered things had changed without my knowledge. That's a rather wonderful thing.

Implicit Referencing#

Implicit referencing, if you're not aware, is the thing that separates Visual Studio from all other IDEs / text editors. Implicit referencing means that in Visual Studio you don't need to make use of comments at the head of each TypeScript file in order to tell the compiler where it can find the related TypeScript files.

The reference comments aren't necessary when using Visual Studio because the VS project file is used to drive the files passed to the TypeScript compiler (tsc).

The upshot of this is that, at time of writing, you can generally look at a TypeScript codebase and tell whether it was written using Visual Studio by opening up a file at random and eyeballing for something like this at the top:

/// <reference path="other-file.ts" />

"A-ha! They're using "reference" comments Watson. From this I deduce that the individuals in question are using the internal module approach and using Visual Studio as their IDE. Elementary, my dear fellow, quite elementary."

This has important implications. Important I tell you, yes important! Well, important if you want to reduce the barriers between Visual Studio and everyone else. And I do. Whilst I love Visual Studio - it's been my daily workhorse for many years - I also love stepping away from it and using something more stripped down. I also like working with other people without mandating that they need to use Visual Studio as well. In the words of Rodney King, "can't we all get along?".

Cross-IDE TypeScript projects#

I feel I should be clear - you can already set up TypeScript projects to work regardless of IDE. But there's friction. It's not clear cut. You can see a full on discussion around this here but in the end it comes down to making a choice between these 3 options:

  1. Set a flag to false in the project file; this flag effectively deactivates implicit referencing. This approach requires that all developers (regardless of IDE) use /// <reference comments to build context. Compiler options in VS can be controlled using the project file as is.
  2. Using Visual Studio without any csproj tweaks. This approach requires that all files have /// <reference comments at their heads in order to build compilation context outside of Visual Studio. It's possible that the /// <reference comments and the csproj could get out of line - care is required to avoid this. Compiler options in VS can be controlled using the project file as is.
  3. Using just files in Visual Studio with /// <reference comments to build compilation context. This scenario also requires that all developers (regardless of IDE) use /// <reference comments to build context. In Visual Studio there will be no control over compiler options.

As you can see - this is sub-optimal. But don't worry - there's a new sheriff in town....

tsconfig.json#

I'd decided to give the Atom TypeScript plugin a go as I'd heard much enthusiastic noise about it. I fired it up and pointed it at a TypeScript AngularJS project built in Visual Studio. I was mentally preparing myself for the job of adding all the /// references in when I suddenly noticed a file blinking at me:

tsconfig.json? What's that? Time to read the docs:

Supported via tsconfig.json (read more) which is going to be the defacto Project file format for the next versions of TypeScript.

"read more"? Oh yes indeedy - I think I will "read more"!

A unified project format for TypeScript (see merged PR on Microsoft/TypeScript). The TypeScript compiler (1.4 and above) only cares about compilerOptions and files. We add additional features to this with the typescript team's approval to extend the file as long as we don't conflict:

  • compilerOptions similar to what you would pass on the commandline to tsc.
  • filesGlob: To make it easier for you to just add / remove files in your project we add filesGlob which accepts an array of glob / minimatch / RegExp patterns (similar to grunt) to specify source files.
  • format: Code formatting options
  • version: The TypeScript version

That's right folks, we don't need /// <reference comments anymore. In a blinding flash of light it all changes. We're going from the dark end of the street, to the bright side of the road. tsconfig.json is here to ease away the pain and make it all better. Let's enjoy it while we can.

This change should ship with TypeScript 1.5 (hopefully) for those using Visual Studio. For those using Atom TypeScript (and as of today that includes me) the carnival celebrations can begin now!

Thanks to @basarat, whom I have quoted at length, and Daniel Earwicker, who is the reason that I came to discover tsconfig.json.

TypeScript: In Praise of Union Types

(& How to Express Functions in UTs)#

Have you heard the good news my friend? I refer, of course, to the shipping of TypeScript 1.4 and my favourite language feature since generics.... Union Types.

In the 1.4 announcement Jonathan Turner described Union Types thusly:

JavaScript functions may take a number of possible argument types. Up to now, we’ve supported this using function overloads. Starting with TypeScript 1.4, we’ve generalized this capability and now allow you to specify that a value is one of a number of different types using a union type:

function f(x: number | number[]) {
if (typeof x === "number") {
return x + 10;
}
else {
// return sum of numbers
}
}

Once you have a value of a union type, you can use typeof and instanceof checks to use the value in a type-safe way. You'll notice we use this in the above example and can treat x as a number type inside of the if-block.

Union types are a new kind of type and work any place you specify a type.
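As a small aside of my own (this isn't from the announcement), the instanceof flavour of narrowing looks much the same:

class Sage { advise() { return 'patience'; } }
class Fool { blurt() { return 'nonsense'; } }

function listen(speaker: Sage | Fool): string {
    if (speaker instanceof Sage) {
        return speaker.advise(); // narrowed to Sage inside this if-block
    }
    if (speaker instanceof Fool) {
        return speaker.blurt(); // narrowed to Fool here
    }
    return '';
}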

Lovely right? But what's missing? Well, to my mind, the most helpful aspect of Union Types. Definition file creation.

A little history#

That's right - the days before Union Types are now "history" :-)#

When creating definition files (*.d.ts) in the past there was a problem with TypeScript. A limitation. JavaScript often relies on "option bags" to pass configuration into a method. An "option bag" is essentially a JavaScript object literal which contains properties which are used to perform configuration. A good example of this is the route parameter passed into Angular's ngRoute [when](https://docs.angularjs.org/api/ngRoute/provider/$routeProvider#when) method.

I'd like to draw your attention to 2 of the properties that can be passed in (quoted from the documentation):

  • controller – {(string|function()=} – Controller fn that should be associated with newly created scope or the name of a registered controller if passed as a string.

  • template – {string=|function()=} – html template as a string or a function that returns an html template as a string which should be used by ngView or ngInclude directives. This property takes precedence over templateUrl.

    If template is a function, it will be called with the following parameters:

    {Array.&lt;Object&gt;} - route parameters extracted from the current $location.path() by applying the current route

Both of these properties can be of more than 1 type.

  • controller can be a string or a function.
  • template can be a string or a function that returns a string and has $routeParams as a parameter.

There's the rub. Whilst it was possible to overload functions in TypeScript pre 1.4, it was not possible to overload interface members. This meant the only way to model these sorts of properties was by seeking out a best common type which would fit all scenarios. This invariably meant using the any type. Whilst that worked it didn't lend any consuming code a great deal of type safety. Let's look at a truncated version of [angular-route.d.ts](https://github.com/borisyankov/DefinitelyTyped/blob/c71628e0765eb8e240d8eabd2225f64ea2e2fdb8/angularjs/angular-route.d.ts) for these properties prior to union types:

declare module ng.route {
// ...
interface IRoute {
/**
* {(string|function()=}
* Controller fn that should be associated with newly created scope or
* the name of a registered controller if passed as a string.
*/
controller?: any;
/**
* {string=|function()=}
* Html template as a string or a function that returns an html template
* as a string which should be used by ngView or ngInclude directives. This
* property takes precedence over templateUrl.
*
* If template is a function, it will be called with the following parameters:
*
* {Array.<Object>} - route parameters extracted from the current
* $location.path() by applying the current route
*/
template?: any;
// ...
}
// ...
}

It's any city... Kind of sticks in the craw doesn't it?

A new dawn#

TypeScript 1.4 has shipped and Union Types are with us. We can do better than any. So what does [angular-route.d.ts](https://github.com/borisyankov/DefinitelyTyped/blob/30ce45e0e706322f34608ab6fa5de141bba59c90/angularjs/angular-route.d.ts) look like now we have Union Types?

declare module ng.route {
// ...
interface IRoute {
/**
* {(string|function()=}
* Controller fn that should be associated with newly created scope or
* the name of a registered controller if passed as a string.
*/
controller?: string|Function;
/**
* {string=|function()=}
* Html template as a string or a function that returns an html template
* as a string which should be used by ngView or ngInclude directives. This
* property takes precedence over templateUrl.
*
* If template is a function, it will be called with the following parameters:
*
* {Array.<Object>} - route parameters extracted from the current
* $location.path() by applying the current route
*/
template?: string | { ($routeParams?: ng.route.IRouteParamsService) : string; }
// ...
}
// ...
}

With these changes in place we are now accurately modelling the route option bags in TypeScript. Hoorah!!!

Let's dig in a little. If you look at the controller definition it's pretty straightforward. string|Function - clearly the controller can be a string or a Function. Simple.

Now let's look at the template definition by itself:

template?: string | { ($routeParams?: ng.route.IRouteParamsService) : string; }

As with the controller the template can be a string - that is pretty clear. But what's that hovering on the other side of the "|"? What could { ($routeParams?: ng.route.IRouteParamsService) : string; } be exactly?

Well, in a word, it's a Function. The controller would allow any kind of function at all. However the template definition is deliberately more restrictive. This defines a function which must return a string and which receives an optional parameter of $routeParams of type ng.route.IRouteParamsService.
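To see that shape in isolation, here's a stripped-down sketch of my own (the names are illustrative, not taken from the angular typings):

// A union whose second member is a function type, written as an object with a call signature.
type Template = string | { (name?: string): string };

const asString: Template = '<h1>Hello</h1>';
const asFunction: Template = function (name?: string) { return '<h1>Hello ' + name + '</h1>'; };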

State of the Union#

Hopefully you can now see just how useful Union Types are and how you can express specific sorts of function definitions as part of a Union Type.

The thing that prompted me first to write this post was seeing that there don't appear to be any examples out there of how to express functions inside Union Types. I only landed on the syntax myself after a little experimentation in Visual Studio after I'd installed TS 1.4. I've started work on bringing Union Types to the typings inside DefinitelyTyped and so you'll start to see them appearing more and more. But since it's rather "hidden knowledge" at present I wanted to do my bit to make it a little better known.

As Daniel helpfully points out in the comments there is an alternate syntax - lambda style. So instead of this:

template?: string | { ($routeParams?: ng.route.IRouteParamsService) : string; }

You could write this:

template?: string | (($routeParams?: ng.route.IRouteParamsService) => string);

Just remember to place parentheses around the lambda to clearly delineate it.

Bonfire of the Overloads#

Before I sign off I should mention the ability Union Types give you to define a much terser definition file. Basically the "|" operator makes for a bonfire of the overloads. Where you previously may have had 6 overloads for the same method (each with identical JSDoc) you now only need 1. Which is beautiful (and DRY).
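To give a feel for the saving, here's a made-up before and after (illustrative only, not lifted from jQuery.d.ts):

// Before union types: one overload per accepted argument type (JSDoc omitted for brevity).
interface ElementFinder {
    find(selector: string): Element[];
    find(element: Element): Element[];
    find(elements: Element[]): Element[];
}

// With union types: a single, equivalent signature.
interface TerseElementFinder {
    find(target: string | Element | Element[]): Element[];
}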

It's surprising just what a difference it makes. This is [jQuery.d.ts](https://github.com/borisyankov/DefinitelyTyped/blob/9bd7fe69d98337db56144c3da131d413f5b7e895/jquery/jquery.d.ts) last week (pre TypeScript 1.4). This is [jQuery.d.ts](https://github.com/borisyankov/DefinitelyTyped/blob/9f64372a065541fe2b8f6c5c5cd9b55a1d631f19/jquery/jquery.d.ts) now - with Union Types aplenty. Last week it was ~4000 lines of code. This week it's ~3200 lines of code. With the same functionality. Union Types are FANTASTIC!

Using Gulp in Visual Studio instead of Web Optimization

Update 17/02/2015: I've taken the approach discussed in this post a little further - you can see here#

I've used a number of tools to package up JavaScript and CSS in my web apps. Andrew Davey's tremendous Cassette has been really useful. Also good (although less powerful/magical) has been Microsoft's very own Microsoft.AspNet.Web.Optimization that ships with MVC.

I was watching the ASP.NET Community Standup from October 7th, 2014 and learned that the ASP.Net team is not planning to migrate Microsoft.AspNet.Web.Optimization to the next version of ASP.Net. Instead they're looking to make use of JavaScript task runners like Grunt and maybe Gulp. Perhaps you're even dimly aware that they've been taking steps to make these runners more of a first class citizen in Visual Studio, hence the recent release of the new and groovy Task Runner Explorer.

Gulp has been on my radar for a while now as has Grunt. By "on my radar" what I really mean is "Hmmmm, I really need to learn this..... perhaps I could wait until the Betamax vs VHS battles are done? Oh never mind, here we go...".

My understanding is that Grunt and Gulp essentially do the same thing (run tasks in JavaScript) but have different approaches. Grunt is more about configuration, Gulp is more about code. At present Gulp also has a performance advantage as it does less IO than Grunt - though I understand that's due to change in the future. But generally my preference is code over configuration. On that basis I decided that I was going to give Gulp first crack.

Bub bye Web Optimization#

I already had a project that used Web Optimization to bundle JavaScript and CSS files. When debugging on my own machine Web Optimization served up the full JavaScript and CSS files. Thanks to the magic of source maps I was able to debug the TypeScript that created the JavaScript files too. Which was nice. When I deployed to production, Web Optimization minified and concatenated the JavaScript and CSS files. This meant I had a single HTTP request for JavaScript and a single HTTP request for CSS. This was also... nooice!

I took a copy of my existing project and created a new repo for it on GitHub. It was very simple in terms of bundling. It had a BundleConfig that created 2 bundles; 1 for JavaScript and 1 for CSS:

using System.Web;
using System.Web.Optimization;
namespace Proverb.Web
{
public class BundleConfig
{
// For more information on bundling, visit http://go.microsoft.com/fwlink/?LinkId=301862
public static void RegisterBundles(BundleCollection bundles)
{
var angularApp = new ScriptBundle("~/angularApp").Include(
// Vendor Scripts
"~/scripts/jquery-{version}.js",
"~/scripts/angular.js",
"~/scripts/angular-animate.js",
"~/scripts/angular-route.js",
"~/scripts/angular-sanitize.js",
"~/scripts/angular-ui/ui-bootstrap-tpls.js",
"~/scripts/toastr.js",
"~/scripts/moment.js",
"~/scripts/spin.js",
"~/scripts/underscore.js",
// Bootstrapping
"~/app/app.js",
"~/app/config.route.js",
// common Modules
"~/app/common/common.js",
"~/app/common/logger.js",
"~/app/common/spinner.js",
// common.bootstrap Modules
"~/app/common/bootstrap/bootstrap.dialog.js"
);
// directives
angularApp.IncludeDirectory("~/app/directives", "*.js", true);
// services
angularApp.IncludeDirectory("~/app/services", "*.js", true);
// controllers
angularApp.IncludeDirectory("~/app/admin", "*.js", true);
angularApp.IncludeDirectory("~/app/about", "*.js", true);
angularApp.IncludeDirectory("~/app/dashboard", "*.js", true);
angularApp.IncludeDirectory("~/app/layout", "*.js", true);
angularApp.IncludeDirectory("~/app/sayings", "*.js", true);
angularApp.IncludeDirectory("~/app/sages", "*.js", true);
bundles.Add(angularApp);
bundles.Add(new StyleBundle("~/Content/css").Include(
"~/content/ie10mobile.css",
"~/content/bootstrap.css",
"~/content/font-awesome.css",
"~/content/toastr.css",
"~/content/styles.css"
));
}
}
}

I set myself a task. I wanted to be able to work in *exactly* the way I was working now. But using Gulp instead of Web Optimization. I wanted to lose the BundleConfig above and remove Web Optimization from my application, secure in the knowledge that I had lost nothing. Could it be done? Read on!

Installing Gulp (and Associates)#

I fired up Visual Studio and looked for an excuse to use the Task Runner Explorer. The first thing I needed was Gulp. My machine already had Node and NPM installed so I went to the command line to install Gulp globally:

npm install gulp -g

Now to start to plug Gulp into my web project. It was time to make the introductions: Visual Studio meet NPM. At the root of the web project I created a package.json file by executing the following command and accepting all the defaults:

npm init

I wanted to add Gulp as a development dependency of my project: ("Development" because I only need to run tasks at development time. My app has no dependency on Gulp at runtime - at that point it's just about serving up static files.)

npm install gulp --save-dev

This installs gulp local to the project as a development dependency. As a result we now have a "node_modules" folder sat in our root which contains our node packages. Currently, as our package.json reveals, this is only gulp:

"devDependencies": {
"gulp": "^3.8.8"
}

It's time to go to town. Let's install all the packages we're going to need to bundle and minify JavaScript and CSS:

npm install gulp-concat gulp-uglify gulp-rev del path gulp-ignore gulp-asset-manifest gulp-minify-css --save-dev

This installs the packages as dev dependencies (as you've probably guessed) and leaves us with a list of dev dependencies like this:

"devDependencies": {
"del": "^0.1.3",
"gulp": "^3.8.8",
"gulp-asset-manifest": "0.0.5",
"gulp-concat": "^2.4.1",
"gulp-ignore": "^1.2.1",
"gulp-minify-css": "^0.3.10",
"gulp-rev": "^1.1.0",
"gulp-uglify": "^1.0.1",
"path": "^0.4.9"
}

Making gulpfile.js#

So now I was ready. I had everything I needed to replace my BundleConfig.cs. I created a new file called gulpfile.js in the root of my web project that looked like this:

/// <vs AfterBuild='default' />
var gulp = require("gulp");
// Include Our Plugins
var concat = require("gulp-concat");
var ignore = require("gulp-ignore");
var manifest = require("gulp-asset-manifest");
var minifyCss = require("gulp-minify-css");
var uglify = require("gulp-uglify");
var rev = require("gulp-rev");
var del = require("del");
var path = require("path");
var tsjsmapjsSuffix = ".{ts,js.map,js}";
var excludetsjsmap = "**/*.{ts,js.map}";
var bundleNames = { scripts: "scripts", styles: "styles" };
var filesAndFolders = {
base: ".",
buildBaseFolder: "./build/",
debug: "debug",
release: "release",
css: "css",
// The fonts we want Gulp to process
fonts: ["./fonts/*.*"],
// The scripts we want Gulp to process - adapted from BundleConfig
scripts: [
// Vendor Scripts
"./scripts/angular.js",
"./scripts/angular-animate.js",
"./scripts/angular-route.js",
"./scripts/angular-sanitize.js",
"./scripts/angular-ui/ui-bootstrap-tpls.js",
"./scripts/toastr.js",
"./scripts/moment.js",
"./scripts/spin.js",
"./scripts/underscore.js",
// Bootstrapping
"./app/app" + tsjsmapjsSuffix,
"./app/config.route" + tsjsmapjsSuffix,
// common Modules
"./app/common/common" + tsjsmapjsSuffix,
"./app/common/logger" + tsjsmapjsSuffix,
"./app/common/spinner" + tsjsmapjsSuffix,
// common.bootstrap Modules
"./app/common/bootstrap/bootstrap.dialog" + tsjsmapjsSuffix,
// directives
"./app/directives/**/*" + tsjsmapjsSuffix,
// services
"./app/services/**/*" + tsjsmapjsSuffix,
// controllers
"./app/about/**/*" + tsjsmapjsSuffix,
"./app/admin/**/*" + tsjsmapjsSuffix,
"./app/dashboard/**/*" + tsjsmapjsSuffix,
"./app/layout/**/*" + tsjsmapjsSuffix,
"./app/sages/**/*" + tsjsmapjsSuffix,
"./app/sayings/**/*" + tsjsmapjsSuffix
],
// The styles we want Gulp to process - adapted from BundleConfig
styles: [
"./content/ie10mobile.css",
"./content/bootstrap.css",
"./content/font-awesome.css",
"./content/toastr.css",
"./content/styles.css"
]
};
filesAndFolders.debugFolder = filesAndFolders.buildBaseFolder + "/" + filesAndFolders.debug + "/";
filesAndFolders.releaseFolder = filesAndFolders.buildBaseFolder + "/" + filesAndFolders.release + "/";
/**
* Create a manifest depending upon the supplied arguments
*
* @param {string} manifestName
* @param {string} bundleName
* @param {boolean} includeRelativePath
* @param {string} pathPrepend
*/
function getManifest(manifestName, bundleName, includeRelativePath, pathPrepend) {
// Determine filename ("./build/manifest-debug.json" or "./build/manifest-release.json"
var manifestFile = filesAndFolders.buildBaseFolder + "manifest-" + manifestName + ".json";
return manifest({
bundleName: bundleName,
includeRelativePath: includeRelativePath,
manifestFile: manifestFile,
log: true,
pathPrepend: pathPrepend,
pathSeparator: "/"
});
}
// Delete the build folder
gulp.task("clean", function (cb) {
del([filesAndFolders.buildBaseFolder], cb);
});
// Copy across all files in filesAndFolders.scripts to build/debug
gulp.task("scripts-debug", ["clean"], function () {
return gulp
.src(filesAndFolders.scripts, { base: filesAndFolders.base })
.pipe(gulp.dest(filesAndFolders.debugFolder));
});
// Create a manifest.json for the debug build - this should have lots of script files in
gulp.task("manifest-scripts-debug", ["scripts-debug"], function () {
return gulp
.src(filesAndFolders.scripts, { base: filesAndFolders.base })
.pipe(ignore.exclude("**/*.{ts,js.map}")) // Exclude ts and js.map files from the manifest (as they won't become script tags)
.pipe(getManifest(filesAndFolders.debug, bundleNames.scripts, true));
});
// Copy across all files in filesAndFolders.styles to build/debug
gulp.task("styles-debug", ["clean"], function () {
return gulp
.src(filesAndFolders.styles, { base: filesAndFolders.base })
.pipe(gulp.dest(filesAndFolders.debugFolder));
});
// Create a manifest.json for the debug build - this should have lots of style files in
gulp.task("manifest-styles-debug", ["styles-debug", "manifest-scripts-debug"], function () {
return gulp
.src(filesAndFolders.styles, { base: filesAndFolders.base })
//.pipe(ignore.exclude("**/*.{ts,js.map}")) // Exclude ts and js.map files from the manifest (as they won't become script tags)
.pipe(getManifest(filesAndFolders.debug, bundleNames.styles, true));
});
// Concatenate & Minify JS for release into a single file
gulp.task("scripts-release", ["clean"], function () {
return gulp
.src(filesAndFolders.scripts)
.pipe(ignore.exclude("**/*.{ts,js.map}")) // Exclude ts and js.map files - not needed in release mode
.pipe(concat("app.js")) // Make a single file - if you want to see the contents then include the line below
//.pipe(gulp.dest(releaseFolder))
.pipe(uglify()) // Make the file titchy tiny small
.pipe(rev()) // Suffix a version number to it
.pipe(gulp.dest(filesAndFolders.releaseFolder)); // Write single versioned file to build/release folder
});
// Create a manifest.json for the release build - this should just have a single file for scripts
gulp.task("manifest-scripts-release", ["scripts-release"], function () {
return gulp
.src(filesAndFolders.buildBaseFolder + filesAndFolders.release + "/*.js")
.pipe(getManifest(filesAndFolders.release, bundleNames.scripts, false));
});
// Copy across all files in filesAndFolders.styles to build/debug
gulp.task("styles-release", ["clean"], function () {
return gulp
.src(filesAndFolders.styles)
.pipe(concat("app.css")) // Make a single file - if you want to see the contents then include the line below
//.pipe(gulp.dest(releaseFolder))
.pipe(minifyCss()) // Make the file titchy tiny small
.pipe(rev()) // Suffix a version number to it
.pipe(gulp.dest(filesAndFolders.releaseFolder + "/" + filesAndFolders.css)); // Write single versioned file to build/release folder
});
// Create a manifest.json for the debug build - this should have a single style files in
gulp.task("manifest-styles-release", ["styles-release", "manifest-scripts-release"], function () {
return gulp
.src(filesAndFolders.releaseFolder + "**/*.css")
.pipe(getManifest(filesAndFolders.release, bundleNames.styles, false, filesAndFolders.css + "/"));
});
// Copy across all fonts in filesAndFolders.fonts to both release and debug locations
gulp.task("fonts", ["clean"], function () {
return gulp
.src(filesAndFolders.fonts, { base: filesAndFolders.base })
.pipe(gulp.dest(filesAndFolders.debugFolder))
.pipe(gulp.dest(filesAndFolders.releaseFolder));
});
// Default Task
gulp.task("default", [
"scripts-debug", "manifest-scripts-debug", "styles-debug", "manifest-styles-debug",
"scripts-release", "manifest-scripts-release", "styles-release", "manifest-styles-release",
"fonts"
]);

What gulpfile.js does#

This file does a number of things each time it is run. First of all it deletes any build folder in the root of the web project so we're ready to build anew. Then it packages up files both for debug and for release mode. For debug it does the following:

  1. It copies the ts, js.map and js files declared in filesAndFolders.scripts to the build/debug folder preserving their original folder structure. (So, for example, app/app.ts, app/app.js.map and app/app.js will all end up at build/debug/app/app.ts, build/debug/app/app.js.map and build/debug/app/app.js respectively.) This is done to allow the continued debugging of the original TypeScript files when running in debug mode.
  2. It copies the css files declared in filesAndFolders.styles to the build/debug folder preserving their original folder structure. (So content/bootstrap.css will end up at build/debug/content/bootstrap.css.)
  3. It creates a build/manifest-debug.json file which contains details of all the script and style files that have been packaged up:

{
  "scripts": [
    "scripts/angular.js", "scripts/angular-animate.js", "scripts/angular-route.js",
    "scripts/angular-sanitize.js", "scripts/angular-ui/ui-bootstrap-tpls.js", "scripts/toastr.js",
    "scripts/moment.js", "scripts/spin.js", "scripts/underscore.js",
    "app/app.js", "app/config.route.js",
    "app/common/common.js", "app/common/logger.js", "app/common/spinner.js",
    "app/common/bootstrap/bootstrap.dialog.js",
    "app/directives/imgPerson.js", "app/directives/serverError.js", "app/directives/sidebar.js",
    "app/directives/spinner.js", "app/directives/waiter.js", "app/directives/widgetClose.js",
    "app/directives/widgetHeader.js", "app/directives/widgetMinimize.js",
    "app/services/datacontext.js", "app/services/repositories.js",
    "app/services/repository.sage.js", "app/services/repository.saying.js",
    "app/about/about.js", "app/admin/admin.js", "app/dashboard/dashboard.js",
    "app/layout/shell.js", "app/layout/sidebar.js", "app/layout/topnav.js",
    "app/sages/sageDetail.js", "app/sages/sageEdit.js", "app/sages/sages.js",
    "app/sayings/sayingEdit.js", "app/sayings/sayings.js"
  ],
  "styles": [
    "content/ie10mobile.css", "content/bootstrap.css", "content/font-awesome.css",
    "content/toastr.css", "content/styles.css"
  ]
}

For release our gulpfile works with the same resources but has a different aim. Namely, to minimise the number of HTTP requests, obfuscate the code and version the files produced to prevent caching issues. To achieve those lofty aims it does the following:

  1. It concatenates together all the js files declared in filesAndFolders.scripts, minifies them and writes them to a single build/release/app-{xxxxx}.js file (where -{xxxxx} represents a version created by gulp-rev).
  2. It concatenates together all the css files declared in filesAndFolders.styles, minifies them and writes them to a single build/release/css/app-{xxxxx}.css file. The file is placed in a css subfolder because of relative paths specified in the CSS file.
  3. It creates a build/manifest-release.json file which contains details of all the script and style files that have been packaged up:

{
  "scripts": ["app-95d1e06d.js"],
  "styles": ["css/app-1a6256ea.css"]
}

As you can see, the number of files included is reduced down to 2; 1 for JavaScript and 1 for CSS.
Finally, for both the debug and release packages the contents of the fonts folder are copied across wholesale, preserving the original folder structure. This is because the CSS files contain relative references that point to the font files. If I had image files which were referenced by my CSS I'd similarly need to include these in the build process.
Task Runner Explorer gets in on the action#
The eagle-eyed amongst you will also have noticed a peculiar first line to our gulpfile.js:
/// <vs AfterBuild='default' />

This mysterious comment is actually how the Task Runner Explorer hooks our gulpfile.js into the Visual Studio build process. Our "magic comment" ensures that on the AfterBuild event, Task Runner Explorer runs the default task in our gulpfile.js. The reason we're using the AfterBuild event rather than the BeforeBuild event is because our project contains TypeScript and we need the transpiled JavaScript to be created before we can usefully run our package tasks. If we were using JavaScript alone then that wouldn't be an issue and either build event would do.

How do I use this in my HTML?#

Well this is magnificent - we have a gulpfile that builds our debug and release packages. The question now is, how do we use it?

Web Optimization made our lives really easy. Up in my head I had a @Styles.Render("~/Content/css") which pushed out my CSS and down at the foot of the body tag I had a @Scripts.Render("~/angularApp") which pushed out my script tags. Styles and Scripts are server-side utilities. It would be very easy to write equivalent utility classes that, depending on whether we were in debug or not, read the appropriate build/manifest-xxxxxx.json file and served up either debug or release style / script tags.

That would be pretty simple - and for what it's worth simple is good. But today I felt like a challenge. What say server side rendering had been outlawed? A draconian ruling had been passed and all you had to play with was HTML / JavaScript and a server API that served up JSON? What would you do then? (All fantasy I know... But go with me on this - it's a journey.) Or, more sensibly, what if you just want to remove some of the work your app is doing server-side to bundle and minify? Just serve up static assets instead. Spend less money in Azure, why not?

Before making all the changes, let's review where we were. I had a single MVC view which, in terms of bundles, CSS and JavaScript pretty much looked like this:

<!DOCTYPE html>
<html>
<head>
<!-- ... -->
@Styles.Render("~/Content/css")
</head>
<body>
<!-- ... -->
@Scripts.Render("~/angularApp")
<script>
(function () {
$.getJSON('@Url.Content("~/Home/StartApp")')
.done(function (startUpData) {
var appConfig = $.extend({}, startUpData, {
appRoot: '@Url.Content("~/")',
remoteServiceRoot: '@Url.Content("~/api/")'
});
angularApp.start({
thirdPartyLibs: {
moment: window.moment,
toastr: window.toastr,
underscore: window._
},
appConfig: appConfig
});
});
})();
</script>
</body>
</html>

This is already a more complicated example than most people's use cases. Essentially what's happening here is that both bundles are written out as part of the HTML and then, once the scripts have loaded, the Angular app is bootstrapped with some configuration loaded from the server by a good old jQuery AJAX call.

After reading an article about script loading by the magnificently funny Jake Archibald I felt ready. I cast my MVC view to the four winds and created myself a straight HTML file:

<!DOCTYPE html>
<html>
<head>
<!-- ... -->
</head>
<body>
<!-- ... -->
<script src="Scripts/jquery-2.1.1.min.js"></script>
<script>
(function () {
var appConfig = {};
var scriptsToLoad;
/**
* Handler which fires as each script loads
*/
function onScriptLoad(event) {
scriptsToLoad -= 1;
// Now all the scripts are present start the app
if (scriptsToLoad === 0) {
angularApp.start({
thirdPartyLibs: {
moment: window.moment,
toastr: window.toastr,
underscore: window._
},
appConfig: appConfig
});
}
}
// Load startup data from the server
$.getJSON("api/Startup")
.done(function (startUpData) {
appConfig = startUpData;
// Determine the assets folder depending upon whether in debug mode or not
var buildFolder = appConfig.appRoot + "build/";
var debugOrRelease = (appConfig.inDebug ? "debug" : "release");
var manifestFile = buildFolder + "manifest-" + debugOrRelease + ".json";
var outputFolder = buildFolder + debugOrRelease + "/";
// Load JavaScript and CSS listed in manifest file
$.getJSON(manifestFile)
.done(function (manifest){
manifest.styles.forEach(function (href) {
var link = document.createElement("link");
link.rel = "stylesheet";
link.media = "all";
link.href = outputFolder + href;
document.head.appendChild(link);
});
scriptsToLoad = manifest.scripts.length;
manifest.scripts.forEach(function (src) {
var script = document.createElement("script");
script.onload = onScriptLoad;
script.src = outputFolder + src;
script.async = false;
document.head.appendChild(script);
});
})
});
})();
</script>
</body>
</html>

If you very carefully compare the HTML above with the MVC view that it replaces, you can see the commonalities. They are doing pretty much the same thing - the only real difference is the bootstrapping API. Previously it was an MVC endpoint at /Home/StartApp. Now it's a Web API endpoint at api/Startup. Here's how it works:

  1. A jQuery AJAX call kicks off a call to load the bootstrapping / app config data. Importantly this data includes whether the app is running in debug or not.
  2. Depending on the inDebug flag the app loads either the build/manifest-debug.json or the build/manifest-release.json manifest.
  3. For each CSS file in the styles bundle a link element is created and added to the page.
  4. For each JavaScript file in the scripts bundle a script element is created and added to the page.

It's worth pointing out that this also has a performance edge over Web Optimization as the assets are loaded asynchronously! (Yes, I know it says script.async = false, but that's not what you think it is: dynamically created script elements download in parallel and, by default, execute as soon as they arrive; setting async = false keeps the parallel download whilst preserving execution order. Go read Jake's article!)

To finish off I had to make a few tweaks to my web.config:

<!-- Allow ASP.Net to serve up JSON files -->
<system.webServer>
<staticContent>
<mimeMap fileExtension=".json" mimeType="application/json"/>
</staticContent>
</system.webServer>
<!-- The build folder (and its child folder "debug") will not be cached.
When people are debugging they don't want to cache -->
<location path="build">
<system.webServer>
<staticContent>
<clientCache cacheControlMode="DisableCache"/>
</staticContent>
</system.webServer>
</location>
<!-- The release folder will be cached for a loooooong time
When you're in Production caching is your friend -->
<location path="build/release">
<system.webServer>
<staticContent>
<clientCache cacheControlMode="UseMaxAge"/>
</staticContent>
</system.webServer>
</location>

I want to publish, how do I include my assets?#

It's time for some csproj trickery. I must say I think I'll be glad to see the back of project files when ASP.Net vNext ships. This is what you need:

<Target Name="AfterBuild">
<ItemGroup>
<!-- what ever is in the build folder should be included in the project -->
<Content Include="build\**\*.*" />
</ItemGroup>
</Target>

What's happening here is that *after* a build Visual Studio considers the complete contents of the build folder to be part of the project. It has to be after the build because the folder will be deleted and reconstructed as part of the build.

Caching and Cache-Busting in AngularJS with HTTP interceptors

Loading On-Demand and Caching#

I've written before about my own needs for caching and cache-busting when using RequireJS. Long story short, when I'm loading static resources (scripts / views etc) on demand from the server I want to do a little URL fiddling along the way. I want to do that to cater for these 2 scenarios:

  1. In Development - I want my URLs for static resources to have a unique querystring with each request to ensure that resources are loaded afresh each time. (eg so a GET request URL might look like this: "/app/layout/sidebar.html?v=IAmRandomYesRandomRandomIsWhatIAm58965782")
  2. In Production - I want my URLs for static resources to have a querystring that is driven by the application version number. This means that static resources can potentially be cached with a given querystring - subsequent requests should result in a 304 status code (indicating “Not Modified”) and local cache should be used. But when a new version of the app is rolled out and the app version is incremented then the querystring will change and resources will be loaded anew. (eg a GET request URL might look like this: "/app/layout/sidebar.html?v=1.0.5389.16180") There's a small sketch of this idea just after this list.
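Before the interceptor itself, here's a minimal sketch of how a cache-busting suffix might be derived from the two scenarios above. This is my own illustration (the helper name and values are hypothetical); the post's actual interceptor-based approach follows below.

// Hypothetical helper - not the post's code - for picking the querystring suffix.
function getCacheBusterSuffix(inDebug: boolean, appVersion: string): string {
    return inDebug
        ? 'v=' + Date.now() + Math.random().toString(36).slice(2) // unique per request in development
        : 'v=' + appVersion;                                      // stable per release in production
}

// e.g. "/app/layout/sidebar.html?v=1.0.5389.16180" in production
const sidebarUrl = '/app/layout/sidebar.html?' + getCacheBusterSuffix(false, '1.0.5389.16180');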

Loading Views in AngularJS Using this Approach#

I have exactly the same use cases when I'm using AngularJS for views. Out of the box with AngularJS 1.x views are loaded lazily (unlike controllers, services etc). For that reason I want to use the same approach I've outlined above to load my views. Also, I want to prepend my URLs with the root of my application - this allows me to cater for my app being deployed in a virtual folder.

It turns out that's pretty easy thanks to HTTP interceptors. They allow you to step into the pipeline and access and modify requests and responses made by your application. When AngularJS loads a view it's the HTTP service doing the heavy lifting. So to deal with my own use case, I just need to add in an HTTP interceptor that amends the GET request. This is handled in the example that follows in the configureHttpProvider function: (The example that follows is TypeScript - though if you just chopped out the interface and the type declarations you'd find this is pretty much idiomatic JavaScript)

interface config {
appRoot: string; // eg "/"
inDebug: boolean; // eg true or false
urlCacheBusterSuffix: string;