25 posts tagged with "webpack"

ts-loader goes webpack 5

ts-loader has just released v9.0.0. This post goes through what this release is all about, and what it took to ship this version. For intrigue, it includes a brief scamper into my mental health along the way. Some upgrades go smoothly - this one had some hiccups. But we'll get into that.

One big pull request#

As of v8, ts-loader supported webpack 4 and webpack 5. However the webpack 5 support was best efforts, and not protected by any automated tests. ts-loader has two test packs:

  1. A comparison test pack that compares transpilation and webpack compilation output with known outputs.
  2. An execution test pack that executes Karma test packs written in TypeScript using ts-loader.

The test packs were tightly coupled to webpack 4 (and in the case of the comparison test pack, that's unavoidable). The mission was to port ts-loader to be built against (and have an automated test pack that ran against) webpack 5.

This ended up being a very big pull request. Work on it started back in February 2021 and we're shipping now in April of 2021. I'd initially expected it would take a couple of days at most. I had underestimated.

A number of people collaborated on this PR, either with code, feedback, testing or even just responding to questions. So I'd like to say thank you to all of them.

What's changed#

Let's go through what's different in v9. There are two breaking changes:

  • The minimum webpack version supported is now webpack 5. This simplifies the codebase, which previously had to if/else the various API registrations based on the version of webpack being used.
  • The minimum node version supported is now node 12. Node 10 reaches end of life status at the end of April 2021.

An interesting aspect of migrating to building against webpack 5 was dropping the dependency upon @types/webpack in favour of the types that now ship with webpack 5 itself. This was a mostly great experience; however we discovered some missing pieces.

Most notably, the LoaderContext wasn't strongly typed. LoaderContext is the value of this in the context of a running loader function. So it is probably the most interesting and important type from the perspective of a loader author.

Historically we used our own definition which had been adapted from the one in @types/webpack. I've looked into the possibility of a type being exposed in webpack itself. However, it turns out, it's complicated - with the LoaderContext type being effectively created across two packages. The type is initially created in webpack and then augmented later in loader-runner, prior to being supplied to loaders. You can read more on that here.

For now we've opted to stick with keeping an interface in ts-loader that models what arrives in the loader when executed. We have freshened it up somewhat, to model the webpack 5 world.
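
To give a flavour of what that means for a loader author, here's a minimal sketch. The interface below is illustrative only - it is not ts-loader's actual definition, just a hand-rolled approximation of a couple of the LoaderContext members a loader sees as this:

interface MinimalLoaderContext {
  resourcePath: string; // the file currently being processed
  sourceMap: boolean;   // whether webpack wants a source map generated
}

// Inside a running loader, `this` is the LoaderContext supplied by webpack / loader-runner
function exampleLoader(this: MinimalLoaderContext, source: string): string {
  const note = this.sourceMap ? 'source maps requested' : 'no source maps';
  return `// ${note} for ${this.resourcePath}\n${source}`;
}

export default exampleLoader;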

Alongside these changes, a number of dependencies were upgraded.

The hole#

By the 19th of February most of the work was done. However, we were experiencing different behaviour between Linux and Windows in our comparison test pack.

As far as I was aware, we were doing all the appropriate work to ensure ts-loader and our test packs worked cross platform. But we were still experiencing problems whenever we ran the test pack on Windows. I'd done no end of tweaking but nothing worked. I couldn't explain it. I couldn't fix it. I was finding that tough to deal with.

I really want to be transparent about the warts and all aspect of open source software development. It is like all other types of software development; sometimes things go wrong and it can be tough to work out why. Right then, I was really quite unhappy. Things weren't working code-wise and I was at a loss to say why. This is not something that I dig.

I also wasn't sleeping amazingly at this point. It was winter and we'd been in lockdown in the UK for three months; as the COVID-19 pandemic ground relentlessly on. I love my family dearly. I really do. With that said, having my children around whilst I attempted to work was remarkably tough. I love those guys but, woah, was it stressful.

I was feeling at a low ebb. And I wasn't sure what to do next. So, feeling tired and pretty fed up, I took a break.

"Anybody down there?"#

Time passed. In March Alexander Akait checked in to see how things were going and volunteered to help. He also suggested what turned out to be the fix; namely replacing usage of '\' with '/' in the assets supplied back to webpack. But crucially I implemented this wrong. Observe this commit:

const assetPath = path
  .relative(compilation.compiler.outputPath, outputFile.name)
  // According to @alexander-akait we should always '/' https://github.com/TypeStrong/ts-loader/pull/1251#issuecomment-799606985
  .replace(/\//g, '/');

If you look closely at the replace you'll see that I'm globally replacing '/' with '/' rather than globally replacing '\' with '/'. The wasted time this caused... I could weep.

I generally thrashed around for a bit after this. Going in circles, like a six year old swimming wearing one armband. Then Tobias kindly volunteered to help. This much I've learned from a career in software: if talented people offer their assistance, grab it with both hands!

I'd been trying to be as "learn in public" as possible about the issues I was facing on the pull request. The idea being, to surface the problems in a public forum where others can read and advise. And also to attempt a textual kind of rubber duck debugging.

When Tobias pitched in, I wanted to make it as easy as possible for him to help. So I wrote up a full description of what had changed. What the divergent behaviour in test packs looked like. I shared my speculation for what might be causing the issue (I was wrong by the way). Finally I provided a simple way to get up and running with the broken code. The easier I could make it for others to collaborate on this, I figured, the greater the likelihood of an answer. Tobias got to an answer quickly:

The problem is introduced due to some normalization logic in the test case: see #1273

While the PR fixes the problem, I think the paths should be normalized earlier in the pipeline to make this normalization code unnecessary. Note that asset names should have only / as they are filenames and not paths. Only absolute paths have \.

Tobias had raised a PR which introduced a workaround to resolve things in the test pack. This made me happy. More than that, he also identified that the issue lay in ts-loader itself. This caused me to look again at the changes I'd made, including my replace addition. With fresh eyes, I now realised this was a bug, and fixed it.
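
For the record, the fix amounted to escaping the backslash in the regular expression, so that backslashes really are globally replaced with forward slashes:

const assetPath = path
  .relative(compilation.compiler.outputPath, outputFile.name)
  // globally replace '\' with '/' so webpack always receives forward slashes
  .replace(/\\/g, '/');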

I found then that I could revert Tobias' workaround and still have passing tests. Result!

Release details#

Now that we've got there; we've shipped. You can get the latest version of ts-loader on npm and you can find the release details on GitHub.

Thanks everyone - I couldn't have done it without your help. 🌻❤️

Using TypeScript and ESLint with webpack (fork-ts-checker-webpack-plugin new feature!)

The fork-ts-checker-webpack-plugin has, since its inception, performed two classes of checking:

  1. Compilation errors which the TypeScript compiler surfaces up
  2. Linting issues which TSLint reports

You may have caught the announcement that TSLint is being deprecated and ESLint is the future of linting in the TypeScript world. This plainly has a bearing on linting in fork-ts-checker-webpack-plugin.

I've been beavering away at adding support for ESLint to the fork-ts-checker-webpack-plugin. I'm happy to say, the plugin now supports ESLint. Do you want to get your arms all around ESLint with fork-ts-checker-webpack-plugin? Read on!

How do you migrate from TSLint to ESLint?#

Well, first of all you need the latest and greatest fork-ts-checker-webpack-plugin. Support for ESLint shipped with v1.4.0.

You need to change the options you supply to the plugin in your webpack.config.js to look something like this:

new ForkTsCheckerWebpackPlugin({ eslint: true })

You'll also need to add the various ESLint related packages to your package.json:

yarn add eslint @typescript-eslint/parser @typescript-eslint/eslint-plugin --dev
  • eslint
  • @typescript-eslint/parser: The parser that will allow ESLint to lint TypeScript code
  • @typescript-eslint/eslint-plugin: A plugin that contains ESLint rules that are TypeScript specific

If you want, you can pass options to ESLint using the eslintOptions option as well. These will be passed through to the underlying ESLint CLIEngine when it is instantiated. The supported options are documented here.
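
For example (a sketch - cache is a standard ESLint CLIEngine option; anything you put in eslintOptions is handed straight through to the CLIEngine):

new ForkTsCheckerWebpackPlugin({
  eslint: true,
  eslintOptions: {
    cache: true // passed straight through to the underlying CLIEngine
  }
})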

Go Configure#

Now you're ready to use ESLint, you just need to give it some configuration. Typically, an .eslintrc.js is what you want here.

const path = require('path');

module.exports = {
  parser: '@typescript-eslint/parser', // Specifies the ESLint parser
  plugins: [
    '@typescript-eslint',
  ],
  env: {
    browser: true,
    jest: true
  },
  extends: [
    'plugin:@typescript-eslint/recommended', // Uses the recommended rules from the @typescript-eslint/eslint-plugin
  ],
  parserOptions: {
    project: path.resolve(__dirname, './tsconfig.json'),
    tsconfigRootDir: __dirname,
    ecmaVersion: 2018, // Allows for the parsing of modern ECMAScript features
    sourceType: 'module', // Allows for the use of imports
  },
  rules: {
    // Place to specify ESLint rules. Can be used to overwrite rules specified from the extended configs
    // e.g. "@typescript-eslint/explicit-function-return-type": "off",
    '@typescript-eslint/explicit-function-return-type': 'off',
    '@typescript-eslint/no-unused-vars': 'off',
  },
};

If you're a React person (and I am!) then you'll also need: yarn add eslint-plugin-react. Then enrich your .eslintrc.js a little:

const path = require('path');

module.exports = {
  parser: '@typescript-eslint/parser', // Specifies the ESLint parser
  plugins: [
    '@typescript-eslint',
    'react'
    // 'prettier' commented as we don't want to run prettier through eslint because performance
  ],
  env: {
    browser: true,
    jest: true
  },
  extends: [
    'plugin:@typescript-eslint/recommended', // Uses the recommended rules from the @typescript-eslint/eslint-plugin
    'prettier/@typescript-eslint', // Uses eslint-config-prettier to disable ESLint rules from @typescript-eslint/eslint-plugin that would conflict with prettier
    // 'plugin:react/recommended', // Uses the recommended rules from @eslint-plugin-react
    'prettier/react', // disables react-specific linting rules that conflict with prettier
    // 'plugin:prettier/recommended' // Enables eslint-plugin-prettier and displays prettier errors as ESLint errors. Make sure this is always the last configuration in the extends array.
  ],
  parserOptions: {
    project: path.resolve(__dirname, './tsconfig.json'),
    tsconfigRootDir: __dirname,
    ecmaVersion: 2018, // Allows for the parsing of modern ECMAScript features
    sourceType: 'module', // Allows for the use of imports
    ecmaFeatures: {
      jsx: true // Allows for the parsing of JSX
    }
  },
  rules: {
    // Place to specify ESLint rules. Can be used to overwrite rules specified from the extended configs
    // e.g. "@typescript-eslint/explicit-function-return-type": "off",
    '@typescript-eslint/explicit-function-return-type': 'off',
    '@typescript-eslint/no-unused-vars': 'off',
    // These rules don't add much value, are better covered by TypeScript and good definition files
    'react/no-direct-mutation-state': 'off',
    'react/no-deprecated': 'off',
    'react/no-string-refs': 'off',
    'react/require-render-return': 'off',
    'react/jsx-filename-extension': [
      'warn',
      {
        extensions: ['.jsx', '.tsx']
      }
    ], // also want to use with ".tsx"
    'react/prop-types': 'off' // Is this incompatible with TS props type?
  },
  settings: {
    react: {
      version: 'detect' // Tells eslint-plugin-react to automatically detect the version of React to use
    }
  }
};

You can add Prettier into the mix too. You can see how it is used in the above code sample. But given the impact that has on performance I wouldn't recommend it; hence it's commented out. There's a good piece by Rob Cooper with more details on setting up Prettier and VS Code with TypeScript and ESLint.

Performance and Power Tools#

It's worth noting that support for TypeScript in ESLint is still brand new. As such, the rule of "Make it Work, Make it Right, Make it Fast" applies.... ESLint with TypeScript still has some performance issues which should be ironed out in the fullness of time. You can track them here.

This is important to bear in mind as, when I converted a large codebase over to using ESLint, I discovered that initial performance of linting was terribly slow. Something that's worth doing right now is identifying which rules are costing you most timewise and tweaking based on whether you think they're earning their keep.

The TIMING environment variable can be used to provide a report on the relative performance cost of running each rule. A nice way to plug this into your workflow is to add the cross-env package to your project: yarn add cross-env -D and then add two scripts to your package.json:

"lint": "eslint ./",
"lint-rule-timings": "cross-env TIMING=1 yarn lint"
  • lint - just runs the linter standalone
  • lint-rule-timings - does the same but with the TIMING environment variable set to 1 so a report will be generated.

I'd advise making use of lint-rule-timings to identify which rules are costing you performance and then turning off rules as you need to. Remember, different rules have different value.

Finally, if you'd like to see how it's done, here's an example of porting from TSLint to ESLint.

TypeScript / webpack - you down with PnP? Yarn, you know me!

Yarn PnP is an innovation by the Yarn team designed to speed up module resolution by node. To quote the (excellent) docs:

Plug’n’Play is an alternative installation strategy unveiled in September 2018...

The way regular installs work is simple: Yarn generates a node_modules directory that Node is then able to consume. In this context, Node doesn’t know the first thing about what a package is: it only reasons in terms of files. “Does this file exist here? No? Let’s look in the parent node_modules then. Does it exist here? Still no? Too bad… parent folder it is!” - and it does this until it matches something that matches one of the possibilities. That’s vastly inefficient.

When you think about it, Yarn knows everything about your dependency tree - it evens installs it! So why is Node tasked with locating your packages on the disk? Why don’t we simply query Yarn, and let it tell us where to look for a package X required by a package Y? That’s what Plug’n’Play (abbreviated PnP) is. Instead of generating a node_modules directory and leaving the resolution to Node, we now generate a single .pnp.js file and let Yarn tell us where to find our packages.

Yarn has been worked upon, amongst others, by the excellent Maël Nison. You can hear him talking about it in person in this talk at JSConfEU.

Thanks particularly to Maël's work, it's possible to use Yarn PnP with TypeScript using webpack with ts-loader and fork-ts-checker-webpack-plugin. This post intends to show you just how simple it is to convert a project that uses either to work with Yarn PnP.

Vanilla ts-loader#

Your project is built using standalone ts-loader; i.e. a simple setup that handles both transpilation and type checking.

First things first, add this property to your package.json. (This is only required if you are using Yarn 1; the tag will be optional starting from v2, where projects will switch to PnP by default.)

{
  "installConfig": {
    "pnp": true
  }
}

Also, because this is webpack, we're going to need to add an extra dependency in the form of pnp-webpack-plugin:

yarn add -D pnp-webpack-plugin

To quote the excellent docs, make the following amends to your webpack.config.js:

const PnpWebpackPlugin = require(`pnp-webpack-plugin`);

module.exports = {
  module: {
    rules: [{
      test: /\.ts$/,
      loader: require.resolve('ts-loader'),
      options: PnpWebpackPlugin.tsLoaderOptions(),
    }],
  },
  resolve: {
    plugins: [ PnpWebpackPlugin, ],
  },
  resolveLoader: {
    plugins: [ PnpWebpackPlugin.moduleLoader(module), ],
  },
};

If you have any options you want to pass to ts-loader, just pass them as parameter of pnp-webpack-plugin's tsLoaderOptions function and it will take care of forwarding them properly. Behind the scenes the tsLoaderOptions function is providing ts-loader with the options necessary to switch into Yarn PnP mode.
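
For example, here's a sketch of forwarding a regular ts-loader option (silent here, purely as an illustration) through tsLoaderOptions:

const PnpWebpackPlugin = require(`pnp-webpack-plugin`);

// silent is just an ordinary ts-loader option; tsLoaderOptions forwards it
// and layers the PnP-specific options on top
const tsRule = {
  test: /\.ts$/,
  loader: require.resolve('ts-loader'),
  options: PnpWebpackPlugin.tsLoaderOptions({ silent: true }),
};

module.exports = {
  module: { rules: [tsRule] },
  resolve: { plugins: [PnpWebpackPlugin] },
  resolveLoader: { plugins: [PnpWebpackPlugin.moduleLoader(module)] },
};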

Congratulations; you now have ts-loader functioning with Yarn PnP support!

fork-ts-checker-webpack-plugin with ts-loader#

You may well be using fork-ts-checker-webpack-plugin to handle type checking whilst ts-loader gets on with the transpilation. This workflow is also supported using pnp-webpack-plugin. You'll have needed to follow the same steps as the ts-loader setup. It's just the webpack.config.js tweaks that will be different.

const PnpWebpackPlugin = require(`pnp-webpack-plugin`);
const ForkTsCheckerWebpackPlugin = require('fork-ts-checker-webpack-plugin');

module.exports = {
  plugins: [
    new ForkTsCheckerWebpackPlugin(PnpWebpackPlugin.forkTsCheckerOptions({
      useTypescriptIncrementalApi: false, // not possible to use this until: https://github.com/microsoft/TypeScript/issues/31056
    })),
  ],
  module: {
    rules: [{
      test: /\.ts$/,
      loader: require.resolve('ts-loader'),
      options: PnpWebpackPlugin.tsLoaderOptions({ transpileOnly: true }),
    }],
  },
  resolve: {
    plugins: [ PnpWebpackPlugin, ],
  },
  resolveLoader: {
    plugins: [ PnpWebpackPlugin.moduleLoader(module), ],
  },
};

Again if you have any options you want to pass to ts-loader, just pass them as parameter of pnp-webpack-plugin's tsLoaderOptions function. As we're using fork-ts-checker-webpack-plugin we're going to want to stop ts-loader doing type checking with the transpileOnly: true option.

We're now initialising fork-ts-checker-webpack-plugin with pnp-webpack-plugin's forkTsCheckerOptions function. Behind the scenes the forkTsCheckerOptions function is providing the fork-ts-checker-webpack-plugin with the options necessary to switch into Yarn PnP mode.

And that's it! You now have ts-loader and fork-ts-checker-webpack-plugin functioning with Yarn PnP support!

Living on the Bleeding Edge#

Whilst you can happily develop and build using Yarn PnP, it's worth bearing in mind that this is a new approach. As such, there's some rough edges right now.

If you're interested in Yarn PnP, it's worth taking the v2 of Yarn (Berry) for a spin. You can find it here: https://github.com/yarnpkg/berry. It's where most of the Yarn PnP work happens, and it includes zip loading - two birds, one stone!

Because there isn't first class support for Yarn PnP in TypeScript itself yet, you cannot make use of the Watch API through fork-ts-checker-webpack-plugin. (You can read about that issue here)

As you've likely noticed, the webpack configuration required makes for a noisy webpack.config.js. Further to that, VS Code (which is powered by TypeScript remember) has no support for Yarn PnP yet and so will present resolution errors to you. If you can ignore the sea of red squigglies all over your source files in the editor and just look at your webpack build you'll be fine.

There is a tool called PnPify that adds support for PnP to TypeScript (in particular tsc). You can find more information here: https://yarnpkg.github.io/berry/advanced/pnpify. For tsc it would be:

$> yarn pnpify tsc [...]

The gist is that it simulates the existence of node_modules by leveraging the data from the PnP file. As such it's not a perfect fix (pnp-webpack-plugin is a better integration), but it's a very useful tool to have to unblock yourself when using a project that doesn't support it.

PnPify actually allows us to use TypeScript in VSCode with PnP! Its documentation is here: https://yarnpkg.github.io/berry/advanced/pnpify#vscode-support

All of these hindrances should hopefully be resolved in future. Ideally, one day a good developer experience can be the default experience. In the meantime, you can still dev - just be prepared for the rough edges. Here are some useful resources to track the future of support:

This last one would be nice because:

  • We'd stop having to patch require
  • We probably wouldn't have to use yarn node if Node itself was able to find the loader somehow (such as if it was listed in the package.json metadata)

Thanks to Maël for his tireless work on Yarn. To my mind Maël is certainly a candidate for the hardest worker in open source. I've been shamelessly borrowing his excellent docs for this post - thanks for writing so excellently Maël!

TypeScript and high CPU usage - watch don't stare!

I'm one of the maintainers of the fork-ts-checker-webpack-plugin. Hi there!

Recently, various issues have been raised against create-react-app (which uses fork-ts-checker-webpack-plugin) as well as against the plugin itself. They've been related to the level of CPU usage in watch mode on idle; i.e. it's high!

Why High?#

Now, under the covers, the fork-ts-checker-webpack-plugin uses the TypeScript watch API.

The marvellous John (not me - another John) did some digging and discovered the root cause came down to the way that the TypeScript watch API watches files:

TS uses internally the fs.watch and fs.watchFile API functions of nodejs for their watch mode. The latter function is even not recommended by nodejs documentation for performance reasons, and urges to use fs.watch instead.

NodeJS doc:

Using fs.watch() is more efficient than fs.watchFile and fs.unwatchFile. fs.watch should be used instead of fs.watchFile and fs.unwatchFile when possible.
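
For reference, here's a minimal sketch of the two Node APIs in question - fs.watchFile polls the file on an interval, whereas fs.watch leans on the operating system's file system events where it can:

const fs = require('fs');

// stat-based polling; comparatively expensive while idle
fs.watchFile(__filename, { interval: 1000 }, (curr, prev) => {
  console.log('file changed (polling)');
});

// event-based watching; generally much cheaper
fs.watch(__filename, eventType => {
  console.log(`file ${eventType} (fs events)`);
});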

"there is another"#

John also found that there are other file watching behaviours offered by TypeScript. What's more, the file watching behaviour is configurable with an environment variable. That's right, if an environment variable called TSC_WATCHFILE is set, it controls the file watching approach used. Big news!

John did some rough benchmarking of the performance of the different options that can be set, on his PC running 64-bit Linux. Here's how it came out:

| Value | CPU usage on idle |
| --- | --- |
| TS default (TSC_WATCHFILE not set) | 7.4% |
| UseFsEventsWithFallbackDynamicPolling | 0.2% |
| UseFsEventsOnParentDirectory | 0.2% |
| PriorityPollingInterval | 6.2% |
| DynamicPriorityPolling | 0.5% |
| UseFsEvents | 0.2% |

As you can see, the default performs poorly. On the other hand, an option like UseFsEventsWithFallbackDynamicPolling is comparatively greased lightning.

workaround!#

To get this better experience into your world now, you could just set an environment variable on your machine. However, that doesn't scale; let's instead look at introducing the environment variable into your project explicitly.

We're going to do this in a cross platform way using cross-env (https://github.com/kentcdodds/cross-env). This is a mighty useful utility by Kent C Dodds which allows you to set environment variables in a way that will work on Windows, Mac and Linux. Imagine it as the jQuery of the environment variables world :-)

Let's add it as a devDependency:

yarn add -D cross-env

Then take a look at your package.json. You've probably got a start script that looks something like this:

"start": "webpack-dev-server --progress --color --mode development --config webpack.config.development.js",

Or if you're a create-react-app user maybe this:

"start": "react-scripts start",

Prefix your start script with cross-env TSC_WATCHFILE=UseFsEventsWithFallbackDynamicPolling. This will, when run, initialise an environment variable called TSC_WATCHFILE with the value UseFsEventsWithFallbackDynamicPolling. Then it will start your development server as it did before. When TypeScript is fired up by webpack it will see this environment variable and use it to configure the file watching behaviour to one of the more performant options.

So, in the case of a create-react-app user, your finished start script would look like this:

"start": "cross-env TSC_WATCHFILE=UseFsEventsWithFallbackDynamicPolling react-scripts start",

The Future#

There's a possibility that the default watch behaviour may change in TypeScript in future. It's currently under discussion, you can read more here.

The Big One Point Oh

It's time for the first major version of fork-ts-checker-webpack-plugin. It's been a long time coming :-)

A Little History#

The fork-ts-checker-webpack-plugin was originally the handiwork of Piotr Oleś. He raised an issue with ts-loader suggesting it could be the McCartney to ts-loader's Lennon:

Hi everyone!

I've created webpack plugin: fork-ts-checker-webpack-plugin that plays nicely with ts-loader. The idea is to compile project with transpileOnly: true and check types on separate process (async). With this approach, webpack build is not blocked by type checker and we have semantic check with fast incremental build. More info on github repo :)

So if you like it and you think it would be good to add some info in README.md about this plugin, I would be greatful.

Thanks :)

We did like it. We did think it would be good. We took him up on his kind offer.

Since that time many people have had their paws on the fork-ts-checker-webpack-plugin codebase. We love them all.

One Point Oh#

We could have had our first major release a long time ago. The idea first occurred when webpack 5 alpha appeared. "Huh, look at that, a major version number.... Maybe we should do that?" "Great idea chap - do it!" So here it is; fresh out the box: v1.0.0

There are actually no breaking changes that we're aware of; users of 0.x fork-ts-checker-webpack-plugin should be able to upgrade without any drama.

Incremental Watch API on by Default#

Users of TypeScript 3+ may notice a performance improvement as by default the plugin now uses the incremental watch API in TypeScript.

Should this prove problematic you can opt out of using it by supplying useTypescriptIncrementalApi: false. We are aware of an issue with Vue and the incremental API. We hope it will be fixed soon - a generous member of the community is taking a look. In the meantime, we will not default to using the incremental watch API when in Vue mode.

Compatibility#

As it stands, the plugin supports webpack 2, 3, 4 and 5 alpha. It is compatible with TypeScript 2.1+ and TSLint 4+.

Right that's it - enjoy it! And thanks everyone for contributing - we really dig your help. Much love.

TypeScript and webpack: Watch It

All I ask for is a compiler and a tight feedback loop. Narrowing the gap between making a change to a program and seeing the effect of that is a productivity boon. The TypeScript team are wise cats and dig this. They've taken strides to improve the developer experience of TypeScript users by introducing a "watch" API which can be leveraged by other tools. To quote the docs:

TypeScript 2.7 introduces two new APIs: one for creating "watcher" programs that provide set of APIs to trigger rebuilds, and a "builder" API that watchers can take advantage of... This can speed up large projects with many files.

Recently the wonderful 0xorial opened a PR to add support for the watch API to the fork-ts-checker-webpack-plugin.

I took this PR for a spin on a large project that I work on. With my machine, I was averaging 12 seconds between incremental builds. (I will charitably describe the machine in question as "challenged"; hobbled by one of the most aggressive virus checkers known to mankind. Fist bump InfoSec 🤜🤛😉) Switching to using the watch API dropped this to a mere 1.5 seconds!

You Can Watch Too#

0xorial's PR was merged toot suite and has been released under the plugin's @next tag. If you'd like to take this for a spin then you can. Just:

  1. Up your version of the plugin to the @next version in your package.json
  2. Add useTypescriptIncrementalApi: true to the plugin when you initialise it in your webpack.config.js.

That's it.
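
In webpack.config.js terms, step 2 amounts to something like this (a sketch):

const ForkTsCheckerWebpackPlugin = require('fork-ts-checker-webpack-plugin');

module.exports = {
  // ...the rest of your webpack config
  plugins: [
    // switch the plugin over to TypeScript's watch API
    new ForkTsCheckerWebpackPlugin({ useTypescriptIncrementalApi: true })
  ]
};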

Mary Poppins#

Sorry, I was trying to paint a word picture of something you might watch that was also comforting. Didn't quite work...

Anyway, you might be thinking "wait, just hold on a minute.... he said @next - I am not that bleeding edge." Well, it's not like that. Don't be scared.

fork-ts-checker-webpack-plugin has merely been updated for webpack 5 (which is in alpha) and the @next reflects that. To be clear, the @next version of the plugin still supports (remarkably!) webpack 2, 3 and 4 as well as 5 alpha. Users of current and historic versions of webpack should feel safe using the @next version; for webpack 2, 3 and 4 expect stability. webpack 5 users should expect potential changes to align with webpack 5 as it progresses.

Roadmap#

This is available now and we'd love for you to try it out. As you can see, at the moment it's opt-in. You have to explicitly choose to use the new behaviour. Depending upon how testing goes, we may look to make this the default behaviour for the plugin in future (assuming users are running a high enough version of TypeScript). It would be great to hear from people if they have any views on that, or feedback in general.

Much ❤️ y'all. And many thanks to the very excellent 0xorial for the hard work.

You Might Not Need thread-loader

It all started with a GitHub issue. Ernst Ammann reported:

Without the thread-loader, compilation takes three to four times less time on changes. We could remove it.

If you're not aware of the webpack-config-plugins project then I commend it to you. Famously, webpack configuration can prove tricky. webpack-config-plugins borrows the idea of presets from Babel. It provides a number of pluggable webpack configurations which give a best practice setup for different webpack use cases. So if you're no expert with webpack and you want a good setup for building your TypeScript / Sass / JavaScript then webpack-config-plugins has got your back.

One of the people behind the project is the very excellent Jan Nicklas who is well known for his work on the html-webpack-plugin.

It was Jan who responded to Ernst's issue and decided to look into it.

All I Want For Christmas is Faster Builds#

Everyone wants fast builds. I do. You do. We all do. webpack-config-plugins is about giving these to the user in a precooked package.

There's a webpack loader called thread-loader which spawns multiple processes and splits up work between them. It was originally inspired by the work in the happypack project which does a similar thing.

I wrote a blog post some time ago which gave details about ways to speed up your TypeScript builds by combining the ts-loader project (which I manage) with the fork-ts-checker-webpack-plugin project (which I'm heavily involved with).

That post was written back in the days of webpack 2 / 3. It advocated use of both happypack / thread-loader to drop your build times even further. As you'll see, now that we're well into the world of webpack 4 (with webpack 5 waiting in the wings) the advantages of happypack / thread-loader are no longer so profound.

webpack-config-plugins follows the advice I set out in my post; it uses thread-loader in its pluggable configurations. Now, back to Ernst's issue.

thread-loader: Infinity War#

Jan quickly identified the problem. He did that rarest of things; he read the documentation which said:

// timeout for killing the worker processes when idle
// defaults to 500 (ms)
// can be set to Infinity for watching builds to keep workers alive
poolTimeout: 2000,

The webpack-config-plugins configurations (running in watch mode) were subject to the thread loaders being killed after 500ms. They got resurrected when they were next needed; but that's not as instant as you might hope. Jan then did a test:

(default pool - 30 runs - 1000 components ) average: 2.668068965517241
(no thread-loader - 30 runs - 1000 components ) average: 1.2674137931034484
(Infinity pool - 30 runs - 1000 components ) average: 1.371827586206896

This demonstrates that using thread-loader in watch mode with poolTimeout: Infinity performs significantly better than when it defaults to 500ms. But perhaps more significantly, not using thread-loader performs even better still.
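
If you do decide to keep thread-loader around for watch mode, the poolTimeout tweak looks something like this (a sketch - how you detect that you're running a watching build is up to you; the isWatchBuild flag here is purely illustrative):

// purely illustrative flag - derive it however suits your setup
const isWatchBuild = process.env.WEBPACK_WATCH === 'true';

const threadLoader = {
  loader: 'thread-loader',
  options: {
    // keep workers alive for the duration of a watching build
    poolTimeout: isWatchBuild ? Infinity : 500,
  },
};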

"Maybe You've Thread Enough"#

When I tested using thread-loader in watch mode with poolTimeout: Infinity on my own builds I got the same benefit Jan had. I also got even more benefit from dropping thread-loader entirely.

A likely reason for this benefit is that typically when you're developing, you're working on one file at a time. Hence you only transpile one file at a time:

So there's not a great deal of value that thread-loader can add here; mostly it's twiddling thumbs and adding an overhead. To quote the docs:

Each worker is a separate node.js process, which has an overhead of ~600ms. There is also an overhead of inter-process communication.

Use this loader only for expensive operations!

Now, my build is not your build. I can't guarantee that you'll get the same results as Jan and I experienced; but I would encourage you to investigate if you're using thread-loader correctly and whether it's actually helping you. In these days of webpack 4+ perhaps it isn't.

There are still scenarios where thread-loader provides an advantage. It can speed up production builds. It can speed up the initial startup of watch mode. In fact Jan has subsequently improved the thread-loader to that specific end. Yay Jan!

If this is all too much for you, and you want to hand off the concern to someone else then perhaps all of this serves as a motivation to just sit back, put your feet up and start using webpack-config-plugins instead of doing your own configuration.

ts-loader Project References: First Blood

So project references eh? They shipped with TypeScript 3. We've just shipped initial support for project references in ts-loader v5.2.0. All the hard work was done by the amazing Andrew Branch. In fact I'd recommend taking a gander at the PR. Yay Andrew!

This post will take us through the nature of the support for project references in ts-loader now, and what we hope the future will bring. It shamelessly borrows from the README.md documentation that Andrew wrote as part of the PR. Because I am not above stealing.

TL;DR#

Using project references currently requires building referenced projects outside of ts-loader. We don’t want to keep it that way, but we’re releasing what we’ve got now. To try it out, you’ll need to pass projectReferences: true to loaderOptions.
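
In webpack.config.js terms that looks something like this (a sketch):

module.exports = {
  // ...the rest of your webpack config
  module: {
    rules: [
      {
        test: /\.ts$/,
        loader: 'ts-loader',
        options: {
          // opt in to the initial project references support
          projectReferences: true
        }
      }
    ]
  }
};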

Like tsc, but not like tsc --build#

ts-loader has partial support for project references in that it will load dependent composite projects that are already built, but will not currently build/rebuild those upstream projects. The best way to explain exactly what this means is through an example. Say you have a project with a project reference pointing to the lib/ directory:

tsconfig.json
app.ts
lib/
  tsconfig.json
  niftyUtil.ts

And we’ll assume that the root tsconfig.json has { "references": [{ "path": "lib" }] }, which means that any import of a file that’s part of the lib sub-project is treated as a reference to another project, not just a reference to a TypeScript file. Before discussing how ts-loader handles this, it’s helpful to review at a really basic level what tsc itself does here. If you were to run tsc on this tiny example project, the build would fail with the error:

error TS6305: Output file 'lib/niftyUtil.d.ts' has not been built from source file 'lib/niftyUtil.ts'.

Using project references actually instructs tsc not to build anything that’s part of another project from source, but rather to look for any .d.ts and .js files that have already been generated from a previous build. Since we’ve never built the project in lib before, those files don’t exist, so building the root project fails. Still just thinking about how tsc works, there are two options to make the build succeed: either run tsc -p lib/tsconfig.json first, or simply run tsc --build, which will figure out that lib hasn’t been built and build it first for you.

Ok, so how is that relevant to ts-loader? Because the best way to think about what ts-loader does with project references is that it acts like tsc, but not like tsc --build. If you run ts-loader on a project that’s using project references, and any upstream project hasn’t been built, you’ll get the exact same error TS6305 that you would get with tsc. If you modify a source file in an upstream project and don’t rebuild that project, ts-loader won’t have any idea that you’ve changed anything—it will still be looking at the output from the last time you built that file.

“Hey, don’t you think that sounds kind of useless and terrible?”#

Well, sort of. You can consider it a work-in-progress. It’s true that on its own, as of today, ts-loader doesn’t have everything you need to take advantage of project references in webpack. In practice, though, consuming upstream projects and building upstream projects are somewhat separate concerns. Building them will likely come in a future release. For background, see the original issue.

outDir Windows problemo.#

At the moment, composite projects built using the outDir compiler option cannot be consumed using ts-loader on Windows. If you try to, ts-loader throws a "has not been built from source file" error. You can see Andrew and I puzzling over it in the PR. We don't know why yet; it's possible there's a bug in tsc. It's more likely there's a bug in ts-loader. Hopefully it's going to get solved at some point. (Hey, maybe you're the one to solve it!) Either way, we didn't want to hold back from releasing. So if you're building on Windows then avoid building composite projects using outDir.

It's Not Dead 2: mobx-react-devtools and the undead

I spent today digging through our webpack 4 config trying to work out why a production bundle contained code like this:

if("production"!==e.env.NODE_ENV){//...

My expectation was that with webpack 4 and 'mode': 'production', behind the scenes all process.env.NODE_ENV statements would be converted to 'production'. Subsequently Uglify would automatically get its groove on with the resulting if("production"!=="production")... et voilà! The dead code would be stripped.
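
For context, webpack 4's 'mode': 'production' effectively applies the DefinePlugin for you, roughly along these lines:

const webpack = require('webpack');

// approximately what production mode does on your behalf
const defineNodeEnv = new webpack.DefinePlugin({
  'process.env.NODE_ENV': JSON.stringify('production')
});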

It seemed that was not the case. I was seeing (regrettably) undead code. And who here actually likes the undead?

Who Betrayed Me?#

My beef was with webpack. It done did me wrong. Or... So I thought. webpack did nothing wrong. It is pure and good and unjustly complained about. It was my other love: mobx. Or to be more specific: mobx-react-devtools.

It turns out that the way you use mobx-react-devtools reliably makes the difference. It's the cause of the stray ("production"!==e.env.NODE_ENV) statements in our bundle output. After a long time I happened upon this issue which contained a gem by one Giles Butler. His suggested way to reference mobx-react-devtools is (as far as I can tell) the solution!

On a dummy project I had the mobx-react-devtools advised code in place:

import * as React from 'react';
import { Layout } from './components/layout';
import DevTools from 'mobx-react-devtools';

export const App: React.SFC<{}> = _props => (
  <div className="ui container">
    <Layout />
    {process.env.NODE_ENV !== 'production' ? <DevTools position={{ bottom: 20, right: 20 }} /> : null}
  </div>
);

With this I had a build size of 311kb. Closer examination of my bundle revealed that my bundle.js was riddled with ("production"!==e.env.NODE_ENV) statements. Sucks, right?

Then I tried this instead:

import * as React from 'react';
import { Layout } from './components/layout';

const { Fragment } = React;
const DevTools = process.env.NODE_ENV !== 'production' ? require('mobx-react-devtools').default : Fragment;

export const App: React.SFC<{}> = _props => (
  <div className="ui container">
    <Layout />
    <DevTools position={{ bottom: 20, right: 20 }} />
  </div>
);

With this approach I got a build size of 191kb. This was thanks to the dead code being actually stripped. That's a saving of 120kb!

Perhaps We Change the Advice?#

There's a suggestion that the README should be changed to reflect this advice - until that happens, I wanted to share this solution. Also, I've a nagging feeling that I've missed something pertinent here; if someone knows something that I should... Tell me please!

webpack 4 - ts-loader / fork-ts-checker-webpack-plugin betas

The first webpack 4 beta dropped on Friday. Very exciting! Following hot on the heels of those announcements, I've some news to share too. Can you guess what it is?

ts-loader#

Yes! The ts-loader beta to work with webpack 4 is available. To get hold of the beta:

Remember to use this in concert with the webpack 4 beta. To see a working example take a look at the "vanilla" example.

fork-ts-checker-webpack-plugin#

There's more! You may like to use the fork-ts-checker-webpack-plugin (https://github.com/Realytics/fork-ts-checker-webpack-plugin), which goes lovely with ts-loader and a biscuit. There is a beta available for that too:

  • When using yarn: yarn add johnnyreilly/fork-ts-checker-webpack-plugin#4.0.0-beta.1 -D
  • When using npm: npm install johnnyreilly/fork-ts-checker-webpack-plugin#4.0.0-beta.1 -D

To see a working example take a look at the "fork-ts-checker" example.

PRs#

If you would like to track the progress of these betas then I encourage you to take a look at the PRs they were built from. The ts-loader PR can be found here. The fork-ts-checker-webpack-plugin PR can be found here.

These are betas so things may change further; though hopefully not significantly.

ts-loader 2017 retrospective

2017 is drawing to a close, and it's been a big, big year in webpack-land. It's been a big year for ts-loader too. At the start of the year v1.3.3 was the latest version available, officially supporting webpack 1. (Old school!) We end the year with ts-loader sitting pretty at v3.2.0 and supporting webpack 2 and 3.

Many releases were shipped and that was down to a whole bunch of folk. People helped out with bug fixes, features, advice and docs improvements. All of these help. ts-loader wouldn't be where it is without you so thanks to everyone that helped out - you rock!

I'm really grateful to all of you. Thanks so much! (Apologies if I've missed anyone out - I know there are more of you still.)

fork-ts-checker-webpack-plugin build speed improvements#

Alongside others' direct contributions to ts-loader, other projects improved the experience of using ts-loader. Piotr Oleś dropped his fork-ts-checker-webpack-plugin (https://github.com/Realytics/fork-ts-checker-webpack-plugin) this year, which nicely increased build speed when used with ts-loader.

That opened up the possibility of adding HappyPack support. I had the good fortune to work with webpack's Tobias Koppers and ExtraHop's Alex Birmingham on improving TypeScript build speed further.

So what does the future hold?

ts-loader 4.0 (Live webpack or Die Hard)#

The web marches on and webpack gallops alongside. Here's what's in the pipeline for ts-loader in 2018:

Start using the new watch API#

A new watch API is being made available in the TypeScript API. We have a PR from the amazing Sheetal Nandi which adds support to ts-loader. Given that's quite a big PR we want to merge that before anything else lands. The watch API is still being finalised but once it lands in TypeScript we'll look to merge the PR and ship a new version of ts-loader.

Drop custom module resolution#

Historically ts-loader has had its own module resolution mechanism in place. We're going to look to move to using the TypeScript mechanism instead. The old module resolution will be deprecated but will remain available behind a flag for a time. In future we'll look to drop the old mechanism entirely.

Drop support for TypeScript 2.3 and below#

The codebase can be made simpler if we drop support for older versions of TypeScript so that's what we plan to do with our next breaking changes release.

webpack v4 is in alpha now#

If any changes need to happen to ts-loader to support webpack 4 then they will be made. Personally I'm planning to help out with fork-ts-checker-webpack-plugin (https://github.com/Realytics/fork-ts-checker-webpack-plugin) as there will likely be some changes required there.

contextAsConfigBasePath will be replaced with a context option#

The option that landed in the last month doesn't quite achieve the aims of the original PR's author Christian Tinauer. Consequently it's going to be replaced with a new option. This is queued up and ready to go here.

reportFiles option to be added#

Michel Rasschaert is presently working on adding a reportFiles option to ts-loader. You can see the PR in progress here.

Merry Christmas!#

You can expect to see the first releases of ts-loader 4.0 in 2018. In the meantime, I'd like to wish you Merry Christmas and a Happy New Year! And once more, thanks and thanks again to all you generous people who help build ts-loader. You're wonderful and so I'm glad you do what you do... joyeux Noel!

The TypeScript webpack PWA

So, there you sit, conflicted. You've got a lovely build setup; it's a thing of beauty. Precious, polished like a diamond, sharpened like a circular saw. There at the core of your carefully crafted setup sits webpack. Heaving, mysterious... powerful.

There's more. Not only are you sold on webpack, you're all in TypeScript too. But now you've heard tell of "Progressive Web Applications" and "Service Workers".... And you want to be dealt in. You want to build web apps that work offline. It can't work can it? Your build setup's going to be like the creature in the episode where they've taken one too many jumps and it's gone into the foetal position.

So this is the plan kids. Let's take a simple TypeScript, webpack setup and make it a PWA. Like Victoria Wood said...

Let's Do It Tonight#

How to begin? Well first comes the plagiarism; here's a simple TypeScript webpack setup. Rob it. Stick a gun to its head and order it onto your hard drive. yarn install to pick up your dependencies and then yarn start to see what you've got. Something like this:

Beautiful right? And if we yarn build we end up with a simple output:

To test what we've built out we want to use a simple web server to serve up the dist folder. I've got the npm package http-server installed globally for just such an eventuality. So let's http-server ./dist and I'm once again looking at our simple app; it looks exactly the same as when I yarn start. Smashing. What would we see if we were offline? Well thanks to the magic of Chrome DevTools we can find out. Offline and refresh our browser...

Not very user friendly. Once we're done, we should be able to refresh and still see our app.

Work(box) It#

Workbox is a project that makes the setting up of Service Workers (aka the magic that powers PWAs) easier. It supports webpack use cases through the workbox-webpack-plugin; so let's give it a whirl. Incidentally, there's a cracking example on the Workbox site.

yarn add workbox-webpack-plugin --dev adds the plugin to our project. To make use of it, punt your way over to the webpack.production.config.js and add an entry for the plugin. We also need to set the hash parameter of the html-webpack-plugin to be false; if it's true it'll cause problems for the ServiceWorker.

const WorkboxPlugin = require('workbox-webpack-plugin');
//...
module.exports = {
  //...
  plugins: [
    //...
    new HtmlWebpackPlugin({
      hash: false,
      inject: true,
      template: 'src/index.html',
      minify: {
        removeComments: true,
        collapseWhitespace: true,
        removeRedundantAttributes: true,
        useShortDoctype: true,
        removeEmptyAttributes: true,
        removeStyleLinkTypeAttributes: true,
        keepClosingSlash: true,
        minifyJS: true,
        minifyCSS: true,
        minifyURLs: true,
      },
    }),
    new WorkboxPlugin({
      // we want our service worker to cache the dist directory
      globDirectory: 'dist',
      // these are the sorts of files we want to cache
      globPatterns: ['**/*.{html,js,css,png,svg,jpg,gif,json}'],
      // this is where we want our ServiceWorker to be created
      swDest: path.resolve('dist', 'sw.js'),
      // these options encourage the ServiceWorkers to get in there fast
      // and not allow any straggling "old" SWs to hang around
      clientsClaim: true,
      skipWaiting: true,
    }),
  ],
  //...
};

With this in place, yarn build will generate a ServiceWorker. Now to alter our code to register it. Open up index.tsx and add this to the end of the file:

if ('serviceWorker' in navigator) {
  window.addEventListener('load', () => {
    navigator.serviceWorker.register('/sw.js').then(registration => {
      // tslint:disable:no-console
      console.log('SW registered: ', registration);
    }).catch(registrationError => {
      console.log('SW registration failed: ', registrationError);
    });
  });
}

Put it together and...

What Have We Got?#

Let's yarn build again.

Oooohh look! A service worker is with us. Does it work? Let's find out... http-server ./dist Browse to http://localhost:8080 and let's have a look at the console.

Looks very exciting. So now the test; let's go offline and refresh:

You are looking at the 200s of success. You're now running with webpack and TypeScript and you have built a Progressive Web Application. Feel good about life.

Working with Extrahop on webpack and ts-loader

I'm quite proud of this: https://www.extrahop.com/company/blog/2017/extrahop-webpack-accelerating-build-times/

If you didn't know, I spend a good amount of my spare time hacking on open source software. You may not know what that is. I would describe OSS as software made with ❤ by people, for other people to use.

You are currently reading this on a platform that was built using OSS. It's all around you, every day. It's on your phone, on your computer, on your TV. It's everywhere.

It's my hobby, it's part of my work. This specifically was one of those tremendously rare occasions when I got paid directly to work on my hobby, with people much brighter than me. It was brilliant. I loved it; it was a privilege.

Here's to Open Source!

fork-ts-checker-webpack-plugin code clickability

My name is John Reilly and I'm a VS Code addict. There I said it. I'm also a big fan of TypeScript and webpack. I've recently switched to using the awesome fork-ts-checker-webpack-plugin to speed up my builds.

One thing I love is using VS Code both as my editor and my terminal. Using the fork-ts-checker-webpack-plugin I noticed a problem when TypeScript errors showed up in the terminal:

Take a look at the red file location in the console above. What's probably not obvious from the above screenshot is that it is not clickable. I'm used to being able to click on link in the console and bounce straight to the error location. It's a really productive workflow; see a problem, click on it, be taken to the cause, fix it.

I want to click on "C:/source/ts-loader/examples/fork-ts-checker/src/fileWithError.ts(2,7)" and have VS Code open up fileWithError.ts, ideally at line 2 and column 7. But here it's not working. Why?

Well, I initially got this slightly wrong; I thought it was about the formatting of the file path. It is. I thought that having the line number and column number in parentheses after the path (eg "(2,7)") was screwing over VS Code. It isn't. Something else is. Look closely at the screenshot; what do you see? Do you notice how the colour of the line number / column number is different to the path? In the words of Delbert Wilkins: that's crucial.

Yup, the colour change between the path and the line number / column number is the problem. I've submitted a PR to fix this that I hope will get merged. In the meantime you can avoid this issue by dropping this code into your webpack.config.js:

var chalk = require("chalk");
var os = require("os");
var ForkTsCheckerWebpackPlugin = require("fork-ts-checker-webpack-plugin");

function clickableFormatter(message, useColors) {
  var colors = new chalk.constructor({ enabled: useColors });
  var messageColor = message.isWarningSeverity() ? colors.bold.yellow : colors.bold.red;
  var fileAndNumberColor = colors.bold.cyan;
  var codeColor = colors.grey;
  return [
    messageColor(message.getSeverity().toUpperCase() + " in ") +
      fileAndNumberColor(message.getFile() + "(" + message.getLine() + "," + message.getCharacter() + ")") +
      messageColor(':'),
    codeColor(message.getFormattedCode() + ': ') + message.getContent()
  ].join(os.EOL);
}

module.exports = {
  // Other config...
  module: {
    rules: [
      {
        test: /\.tsx?$/,
        loader: 'ts-loader',
        options: { transpileOnly: true }
      }
    ]
  },
  resolve: {
    extensions: ['.ts', '.tsx', '.js']
  },
  plugins: [
    new ForkTsCheckerWebpackPlugin({ formatter: clickableFormatter }) // Here we get our clickability back
  ]
};

With that in place, what do we have? This:

VS Code clickability; it's a beautiful thing.

TypeScript + Webpack: Super Pursuit Mode

This post also featured as a webpack Medium publication.

If you're like me then you'll like TypeScript and you'll like module bundling with webpack. You may also like speedy builds. That's completely understandable. The fact of the matter is, you sacrifice a bit of build speed to have webpack in the mix. Wouldn't it be great if we could even up the difference?

I'm the primary maintainer of ts-loader, a TypeScript loader for webpack. Just recently a couple of PRs were submitted that said, in other words: ts-loader is like this:

But it could be like this:

Apologies for the image quality above; there appear to be no high quality pictures out there of KITT in Super Pursuit Mode for me to defame with Garan Jenkin's atrocious puns.

fork-ts-checker-webpack-plugin#

"Faster type checking with forked process" read the enticing name of the issue. It turned out to be Piotr Oleś (@OlesDev) telling the world about his beautiful creation. He'd put together a mighty fine plugin that can be used alongside ts-loader called the fork-ts-checker-webpack-plugin. The name is a bit of a mouthful but the purpose is mouth-watering. To quote the README, it is a:

Webpack plugin that runs typescript type checker on a separate process.

What does this mean and how does this fit with ts-loader? Well, ts-loader does 2 jobs:

  1. It transpiles your TypeScript into JavaScript and hands it off to webpack
  2. It collects any TypeScript compilation errors and reports them to webpack

What this plugin does is say, "forget about #2 - we've got this." It removes the responsibility for type checking from ts-loader, so the only work ts-loader does is transpilation. In the meantime, the all important type checking is still happening. To be honest, there would be little reason to recommend this approach otherwise. The difference is that fork-ts-checker-webpack-plugin is doing the heavy lifting in a separate process. This provides a nice performance boost to your workflow. ts-loader is doing less and that's a good thing.

The approach used here is similar to that employed by awesome-typescript-loader. ATL is another TypeScript loader for webpack by the excellent Stanislav Panferov. ATL also has a technique for performing typechecking in a forked process. fork-ts-checker-webpack-plugin was an effort by Piotr to implement something similar but with improved incremental build performance.

How do we use it? Add fork-ts-checker-webpack-plugin as a devDependency of your project and then amend the webpack.config.js to set ts-loader into transpileOnly mode and drop the plugin into the mix:

var ForkTsCheckerWebpackPlugin = require('fork-ts-checker-webpack-plugin');

var webpackConfig = {
  // other config...
  context: __dirname, // to automatically find tsconfig.json
  module: {
    rules: [
      {
        test: /\.tsx?$/,
        loader: 'ts-loader',
        options: {
          // disable type checker - we will use it in fork plugin
          transpileOnly: true
        }
      }
    ]
  },
  plugins: [
    new ForkTsCheckerWebpackPlugin()
  ]
};

If you'd like to see an example of how to use the plugin then take a look at a simple example and a more involved one.

HappyPack#

Not so long ago I didn't know what HappyPack was. "Happiness in the form of faster webpack build times." That's what it is.

HappyPack makes webpack builds faster by allowing you to transform multiple files in parallel.

It does this by spinning up multiple threads, each with their own loaders inside. We wanted to do this with ts-loader; to have multiple instances of ts-loader running. Work can then be divided up across these separate loaders. Isn't multi-threading great?

ts-loader did not initially play nicely with HappyPack; essentially this is because ts-loader touches parts of webpack's API that HappyPack replaces. The entirely wonderful Artem Kozlov submitted a PR which added HappyPack support to ts-loader. Support essentially amounts to switching ts-loader to run in transpileOnly mode and ensuring that there is no attempt to talk to parts of the webpack API that HappyPack removes.

It would be hard to recommend using HappyPack as is because, as with transpileOnly mode, you lose all typechecking. Where it becomes worthwhile is where it is combined with the fork-ts-checker-webpack-plugin so you keep the typechecking.

Enough with the chitter chatter; how can we achieve this? Add HappyPack as a devDependency of your project and then amend the webpack.config.js as follows:

var HappyPack = require('happypack');
var ForkTsCheckerWebpackPlugin = require('fork-ts-checker-webpack-plugin');
module.exports = {
// other config...
context: __dirname, // to automatically find tsconfig.json
module: {
rules: [
{
test: /\.tsx?$/,
exclude: /node_modules/,
loader: 'happypack/loader?id=ts'
}
]
},
plugins: [
new HappyPack({
id: 'ts',
threads: 2,
loaders: [
{
path: 'ts-loader',
query: { happyPackMode: true }
}
]
}),
new ForkTsCheckerWebpackPlugin({ checkSyntacticErrors: true })
]
};

Note that the ts-loader options are now configured via the HappyPack query, and that we're running ts-loader with the happyPackMode option set.

There's one other thing to note which is important; we're now passing the checkSyntacticErrors option to the fork plugin. This ensures that the plugin checks for both syntactic errors (eg const array = [{} {}];) and semantic errors (eg const x: number = '1';). By default the plugin only checks for semantic errors. This is because when ts-loader is used with transpileOnly set, ts-loader will still report syntactic errors. But when used in happyPackMode it does not.
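To make the distinction concrete, here's a deliberately invalid TypeScript fragment (file contents made up purely for illustration) annotated with which tool reports what:

// illustration only - this fragment intentionally does not compile

// Syntactic error: missing comma between the object literals.
// ts-loader reports this in plain transpileOnly mode, but not in
// happyPackMode - hence checkSyntacticErrors on the fork plugin.
const array = [{} {}];

// Semantic error: a string is not assignable to a number.
// Only the fork-ts-checker-webpack-plugin reports this once
// ts-loader is running in transpileOnly / happyPackMode.
const x: number = '1';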

If you'd like to see an example of how to use HappyPack then once again we have a simple example and a more involved one.

thread-loader + cache-loader#

You might have some reservations about using HappyPack. First of all the quirky configuration required makes your webpack config rather less comprehensible. Also, HappyPack is not officially blessed by webpack. It is a side project developed externally from webpack and there's no guarantees that new versions of webpack won't break it. Neither of these are reasons not to use HappyPack but they are things to bear in mind.

What if there were a way to parallelise our builds which dealt with these issues? Well, there is! By using thread-loader and cache-loader in combination you can both feel happy that you're using an official webpack workflow and have a config that's less confusing.

What would that config look like? This:

var ForkTsCheckerWebpackPlugin = require('fork-ts-checker-webpack-plugin');
module.exports = {
// other config...
context: __dirname, // to automatically find tsconfig.json
module: {
rules: [{
test: /\.tsx?$/,
use: [
{ loader: 'cache-loader' },
{
loader: 'thread-loader',
options: {
// there should be 1 cpu for the fork-ts-checker-webpack-plugin
workers: require('os').cpus().length - 1,
},
},
{
loader: 'ts-loader',
options: {
happyPackMode: true // IMPORTANT! use happyPackMode mode to speed-up compilation and reduce errors reported to webpack
}
}
]
}]
},
plugins: [
new ForkTsCheckerWebpackPlugin({ checkSyntacticErrors: true })
]
};

As you can see the configuration is much cleaner than with HappyPack. Interestingly ts-loader still needs to run in "happyPackMode" and that's because thread-loader is essentially behaving in the same fashion as with HappyPack and so ts-loader needs to behave in the same way. Probably ts-loader should have a more generic flag name than "happyPackMode". (Famously, naming things is hard; so if you've a good idea, tell me!)

These loaders are new and so tread carefully. My own experiences have been pretty positive but your mileage may vary. Do note that, as with HappyPack, the thread-loader is highly configurable.
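To give a flavour of that configurability, here's a sketch of a couple of commonly tweaked thread-loader options; check the exact names against the thread-loader README rather than taking them as gospel:

{
  loader: 'thread-loader',
  options: {
    // leave one cpu free for the fork-ts-checker-webpack-plugin
    workers: require('os').cpus().length - 1,
    // keep workers alive between incremental builds in watch mode
    poolTimeout: Infinity
  }
}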

If you'd like to see an example of how to use thread-loader and cache-loader then once again we have a simple example and a more involved one.

All This Could Be Yours...#

Wow! It looks like we can cut our build time by 4 minutes! #webpack @typescriptlang // cc @johnny_reilly pic.twitter.com/gjvy9SLBAT

— Donald Pipowitch (@PipoPeperoni) June 23, 2017

In this post we're improving build speeds with TypeScript and webpack in 3 ways:

  1. fork-ts-checker-webpack-plugin: with this plugin in play ts-loader only performs transpilation. ts-loader is doing less, so the build is faster.
  2. HappyPack: with HappyPack in the mix, the build is parallelised. That parallelisation means the build is faster.
  3. thread-loader / cache-loader: with thread-loader and cache-loader, again the build is parallelised and so the build is faster.

Dynamic import: I've been awaiting you...

One of the most exciting features to ship with TypeScript 2.4 was support for the dynamic import expression. To quote the release blog post:

Dynamic import expressions are a new feature in ECMAScript that allows you to asynchronously request a module at any arbitrary point in your program. These modules come back as Promises of the module itself, and can be await-ed in an async function, or can be given a callback with .then.

...

Many bundlers have support for automatically splitting output bundles (a.k.a. “code splitting”) based on these import() expressions, so consider using this new feature with the esnext module target. Note that this feature won’t work with the es2015 module target, since the feature is anticipated for ES2018 or later.

As the post makes clear, this adds support for a very bleeding edge ECMAScript feature. This is not fully standardised yet; it's currently at stage 3 on the TC39 proposals list. That means it's at the Candidate stage and is unlikely to change further. If you'd like to read more about it then take a look at the official proposal here.

Whilst this is super-new, we are still able to use this feature. We just have to jump through a few hoops first.
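Before we get to the setup, here's the sort of thing a dynamic import allows you to write; the module path and function name below are hypothetical, purely for illustration:

// "./heavy-module" and someFeature are made-up names
async function loadFeatureOnDemand() {
  // webpack splits "./heavy-module" into its own chunk and loads it on demand
  const heavyModule = await import("./heavy-module");
  heavyModule.someFeature();
}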

TypeScript Setup#

First of all, you need to install TypeScript 2.4. With that in place you need to make some adjustments to your tsconfig.json in order that the relevant compiler switches are flipped. What do you need? First of all you need to be targeting ECMAScript 2015 as a minimum. That's important specifically because ES2015 contained Promises which is what dynamic imports produce. The second thing you need is to target the module type of esnext. You're likely targeting es2015 now, esnext is that plus dynamic imports.

Here's a tsconfig.json I made earlier which has the relevant settings set:

{
"compilerOptions": {
"allowSyntheticDefaultImports": true,
"lib": [
"dom",
"es2015"
],
"target": "es2015",
"module": "esnext",
"moduleResolution": "node",
"noImplicitAny": true,
"noUnusedLocals": true,
"noUnusedParameters": true,
"removeComments": false,
"preserveConstEnums": true,
"sourceMap": true,
"skipLibCheck": true
}
}

Babel Setup#

At the time of writing, browser support for dynamic import is non-existent. This will likely be the case for some time but it needn't hold us back. Babel can step in here and compile our super-new JS into JS that will run in our browsers today.

You'll need to decide for yourself how much you want Babel to do for you. In my case I'm targeting old school browsers which don't yet support ES2015. You may not need to. However, the one thing that you'll certainly need is the Syntax Dynamic Import plugin. It's this that allows Babel to process dynamic import statements.

These are the options I'm passing to Babel:

var babelOptions = {
"plugins": ["syntax-dynamic-import"],
"presets": [
[
"es2015",
{
"modules": false
}
]
]
};

You're also going to need something that actually executes the imports. In my case I'm using webpack...

webpack#

webpack 2 supports import(). So if you have webpack set up with ts-loader (or awesome-typescript-loader etc) chaining into babel-loader, you should find you have a setup that supports dynamic import. That means a webpack.config.js that looks something like this:

var path = require('path');
var webpack = require('webpack');
var babelOptions = {
"plugins": ["syntax-dynamic-import"],
"presets": [
[
"es2015",
{
"modules": false
}
]
]
};
module.exports = {
entry: './app.ts',
output: {
filename: 'bundle.js'
},
module: {
rules: [{
test: /\.ts(x?)$/,
exclude: /node_modules/,
use: [
{
loader: 'babel-loader',
options: babelOptions
},
{
loader: 'ts-loader'
}
]
}, {
test: /\.js$/,
exclude: /node_modules/,
use: [
{
loader: 'babel-loader',
options: babelOptions
}
]
}]
},
resolve: {
extensions: ['.ts', '.tsx', '.js']
},
};

ts-loader example#

I'm one of the maintainers of ts-loader which is a TypeScript loader for webpack. When support for dynamic imports landed I wanted to add a test to cover usage of the new syntax with ts-loader.

We have 2 test packs for ts-loader, one of which is our "execution" test pack. It is so named because it works by spinning up webpack with ts-loader and then using karma to execute a set of tests. Each "test" in our execution test pack is actually a mini-project with its own test suite (generally jasmine but that's entirely configurable), complete with its own webpack.config.js, karma.conf.js and either a typings.json or package.json for bringing in dependencies. So it's a full test of whether code slung with ts-loader and webpack actually executes when the output is plugged into a browser.

This is the test pack for dynamic imports:

import a from "../src/a";
import b from "../src/b";
describe("app", () => {
it("a to be 'a' and b to be 'b' (classic)", () => {
expect(a).toBe("a");
expect(b).toBe("b");
});
it("import results in a module with a default export", done => {
import("../src/c").then(c => {
// .default is the default export
expect(c.default).toBe("c");
done();
});
});
it("import results in a module with an export", done => {
import("../src/d").then(d => {
// .default is the default export
expect(d.d).toBe("d");
done();
});
});
it("await import results in a module with a default export", async done => {
const c = await import("../src/c");
// .default is the default export
expect(c.default).toBe("c");
done();
});
it("await import results in a module with an export", async done => {
const d = await import("../src/d");
expect(d.d).toBe("d");
done();
});
});

As you can see, it's possible to use the dynamic import as a Promise directly. Alternatively, it's possible to consume the imported module using TypeScript's support for async / await. For my money the latter option makes for much clearer code.

If you're looking for a complete example of how to use the new syntax then you could do worse than taking the existing test pack and tweaking it to your own ends. The only change you'd need to make is to strip out the resolveLoader statements in webpack.config.js and karma.conf.js. (They exist to lock the test to the freshly built ts-loader stored locally. You'll not need this.)

You can find the test in question here. Happy code splitting!

Under the Duck: An Afternoon in Open Source

Have you ever wondered what happens behind the scenes of open source projects? One that I'm involved with is ts-loader; a TypeScript loader for webpack. Yesterday was an interesting day in the life of ts-loader and webpack; things unexpectedly broke. Oh and don't worry, they're fixed now.

How things panned out reflects well on the webpack community. I thought it might be instructive to take a look at the legs furiously paddling underneath the duck of open source. What follows is a minute by minute account of my life on the afternoon of Wednesday 22nd February 2017:

3:55pm#

I'm sat at my desk in the City of London. I have to leave at 4pm to go to the dentist. I'm working away on a project which is built and bundled using ts-loader and webpack. However, having just npm installed and tried to spin up webpack in watch mode, I discover that everything is broken. Watch mode is not working - there's an error being thrown in ts-loader. It's to do with a webpack property called mtimes. ts-loader depends upon it and it looks like it is no longer always passed through. Go figure.

4:01pm#

I've got to go. I'm 15 minutes from Bank station. So, I grab my bag and scarper out the door. On my phone I notice an issue has been raised - other people are being affected by the problem too. As I trot down the various alleys that lead to the station I wonder whether I can work around this issue. Using GitHub to fork, edit code and submit a PR on a mobile phone is possible. Just. But it's certainly not easy...

My PR is in, the various test packs are starting to execute somewhere out there in Travis and Appveyor-land. Then I notice Ed Bishop has submitted a near identical PR. Yay Ed! I'm always keen to encourage people to contribute and so I intend to merge that PR rather than my own.

16:12#

Rubbish. The Waterloo and City Line is out of action. I need to get across London to reach Waterloo or I'll miss my appointment. It's time to start running....

16:15#

It's rather nagging at me that behaviour has changed without warning. mtimes has been reliably in place the entire time I've been involved with ts-loader / webpack. Why now? I don't see any obvious mentions on the webpack GitHub repo. So I head over to the webpack Slack channel and ask (conversation slightly abridged):

johnny_reilly#

Hey all, has something happened to mtimes? Behaviour seems to have changed - now undefined occasionally during watch mode. A PR has been raised against ts-loader to work around this https://github.com/TypeStrong/ts-loader/pull/480#issuecomment-281714600

However I'm wondering if this should actually be merged given behaviour has changed unexpectedly

sokra#

ah...

i removed it. I thought it was unused.

johnny_reilly#

It's definitely not!

sokra#

it's not in the public API^^

Any reason why you are not using getTimes()?

...

johnny_reilly#

Okay, I'm on a train and won't be near a computer for a while. ts-loader is presently broken because it depends on mtimes. Would it be possible for you to add this back, at least for now? I'm aware many people depend on ts-loader and are now broken.

sokra#

sure, I readd it but deprecate it.

...

sean.larkin#

@sokra is this the change you just made for that watchpack bug fix? Or unrelated? Just wanted to track if I didn't already have the change/issue.

sokra#

https://github.com/webpack/watchpack/pull/48

johnny_reilly#

This is what the present code does:

const watcher = watching.compiler.watchFileSystem.watcher ||
watching.compiler.watchFileSystem.wfs.watcher

And then .mtimes

Should I be able to do .getTimes() instead?

sokra#

actually you can't rely on watchFileSystem being NodeJsWatchFileSystem. But this is another topic

...

but yes

johnny_reilly#

Thanks @sokra - when I get to a keyboard I'll swap mtimes for getTimes() and report back.

17:28#

Despite various trains being out of action / missing in action I've made it to the dentist's; phew! I go in for my checkup and plan to take a look at the issue later that evening. In the meantime I'm hoping that Tobias (Sokra) will get a chance to republish so that ts-loader users aren't too impacted.

18:00#

Done at the dentist and I'm heading home. Whilst I've been opening wide and squinting at the ceiling, TypeScript 2.2 has shipped. Whilst this is super exciting, according to Greenkeeper, the new version has broken the build. Arrrrghhhh...

I start to look into this and realise we're not broken because of TypeScript 2.2; we were broken because of the mtimes. Tobias has now re-added mtimes and published. With that in place I requeue a build and.... drum roll.... we're green!

The good news just keeps on coming as Luka Zakrajšek has submitted a PR which uses getTimes() in place of mtimes. And the tests pass. Awesome! MERGE. I just need to cut a release and we're done.
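To sketch the shape of that change (my paraphrase, not the exact diff from the PR):

const watcher = watching.compiler.watchFileSystem.watcher ||
  watching.compiler.watchFileSystem.wfs.watcher;

// before: reading the private property directly
// const times = watcher.mtimes;

// after: using the public API Tobias pointed us at
const times = watcher.getTimes();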

18:15#

I'm home. My youngest son has been suffering from chicken pox all week and as a result my wife has been in isolation, taking care of him. We chat whilst the boys watch Paw Patrol as the bath runs. I flick open the laptop and start doing the various housekeeping tasks around cutting a release. This is interrupted by various bathtime / bedtime activities and I abandon work for now.

19:30#

The boys are down and I get on with the release; updating the changelog, bumping the version number and running the tests. For various reasons this takes longer than it normally does.

20:30#

Finally we're there; ts-loader 2.0.1 ships: https://github.com/TypeStrong/ts-loader/releases/tag/v2.0.1.

I'm tremendously grateful to everyone that helped out - thank you all!

ts-loader 2.0.1 has shipped; thanks @wsokra, @bancek and @mredbishop https://t.co/I00c7sJyFo #typescript

— John Reilly (@johnny_reilly) February 22, 2017

webpack: resolveLoader / alias with query / options

Sometimes you write a post for the ages. Sometimes you write one you hope is out of date before you hit "publish". This is one of those.

There's a bug in webpack's enhanced-resolve. It means that you cannot configure an aliased loader using the query (or options in the webpack 2 nomenclature). Let me illustrate; consider the following code:

module.exports = {
// ...
module: {
loaders: [
{
test: /\.ts$/,
loader: 'ts-loader',
query: {
entryFileIsJs: true
}
}
]
}
}
module.exports.resolveLoader = {
alias: {
'ts-loader': require('path').join(__dirname, "../../index.js")
}
};

At the time of writing, if you alias a loader as above, then the query / options will *not* be passed along. This is bad, particularly given that webpack 2 no longer allows loaders to be configured through custom properties added to the webpack.config.js. So what to do? Well, when this was a problem previously the marvellous James Brantly had a workaround. I've taken that and run with it:

var config = {
// ...
module: {
loaders: [
{
test: /\.ts$/,
loader: 'ts-loader',
query: {
entryFileIsJs: true
}
}
]
}
}
module.exports = config;
var loaderAliasPath = require('path').join(__dirname, "../../../index.js");
var rules = config.module.loaders || config.module.rules;
rules.forEach(function(rule) {
var options = rule.query || rule.options;
rule.loader = rule.loader.replace('ts-loader', loaderAliasPath + (options ? '?' + JSON.stringify(options) : ''));
});

This approach stringifies the query / options and suffixes it to the aliased path. This works as long as the options you're passing are JSON-able (yes it's a word).

As I said earlier: hopefully by the time you read this the workaround will no longer be necessary. But just in case....

webpack: configuring a loader with query / options

webpack 2 is on its way. As one of the maintainers of ts-loader I've been checking out that ts-loader works with webpack 2. It does: phew!

ts-loader has a continuous integration build that runs against webpack 1. When webpack 2 ships we're planning to move to running CI against webpack 2. However, webpack 2 has some breaking changes. The one that's particularly of relevance to our test packs is that a strict schema is now enforced for webpack.config.js with webpack 2. This has been the case since webpack 2 hit beta 23. Check the PR that added it. You can see some of the frankly tortured discussion that this generated as well.

Let's all take a moment and realise that working on open source is sometimes a rather painful experience. Take a breath. Breathe out. Ready to carry on? Great.

There are 2 ways to configure loader options for ts-loader (and in fact this stands for most loaders). Loader options can be set either using a query when specifying the loader or through the ts (insert the name of alternative loaders here) property in the webpack.config.js.

The implication of the breaking change is: with webpack 2 you can no longer configure ts-loader (or any other loader) with a ts (insert the name of alternative loaders here) property in the webpack.config.js. It must be done through the query / options. The following code is no longer valid with webpack 2:

module.exports = {
...
module: {
loaders: [{
test: /\.tsx?$/,
loader: 'ts-loader'
}]
},
// specify option using `ts` property - **only do this if you are using webpack 1**
ts: {
transpileOnly: false
}
}

This change means that we have needed to adjust how our test pack works. We can no longer make use of ts for configuration. Since I wasn't terribly aware of query I thought it made sense to share my learnings.

What exactly is query / options?#

Good question. Well, strictly speaking it's 2 possible things; both ways to configure a webpack loader. Classically query was a string which could be appended to the name of the loader much like a [query string](https://en.wikipedia.org/wiki/Query_string) but actually with greater powers:

module.exports = {
...
module: {
loaders: [{
test: /\.tsx?$/,
loader: 'ts-loader?' + JSON.stringify({
transpileOnly: false
})
}]
}
}

But it can also be a separately specified object that's supplied alongside a loader (I understand this is relatively new behaviour):

module.exports = {
...
module: {
loaders: [{
test: /\.tsx?$/,
loader: 'ts-loader',
query: {
transpileOnly: false
}
}]
}
}

webpack 2 is coming - look busy!#

So if you're planning to move to webpack 2, be aware of this breaking change. You can start moving to using configuration via query right now with webpack 1. You don't need to be using webpack 2 to make the jump. So jump!

Finally, and by way of a PS, query is renamed to options in webpack 2; a much better name to my mind. There's actually a bunch of other renames on the way as well - check out the migration guide for more on this. The important thing to note is that the old names work in webpack 2. But you should plan to move to the new naming at some point as they'll likely disappear when webpack 3 ships.
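By way of illustration, here's the earlier ts-loader example re-expressed with the webpack 2 naming; treat it as a sketch based on the migration guide rather than a definitive config:

module.exports = {
  // ...
  module: {
    rules: [{
      test: /\.tsx?$/,
      use: [{
        loader: 'ts-loader',
        options: { // webpack 2's name for query
          transpileOnly: false
        }
      }]
    }]
  }
};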

webpack: syncing the enhanced-resolve

Like Captain Ahab I resolve to sync the white whale that is webpack's [enhanced-resolve](https://github.com/webpack/enhanced-resolve)... English you say? Let me start again:

So, you're working on a webpack loader. (In my case the typescript loader; [ts-loader](https://github.com/TypeStrong/ts-loader)) You have need of webpack's resolve capabilities. You dig around and you discover that that superpower is lodged in the very heart of the enhanced-resolve package. Fantastic. But wait, there's more: your needs are custom. You need a sync, not an async resolver. (Try saying that quickly.) You regard the description of enhanced-resolve with some concern:

"Offers an async require.resolve function. It's highly configurable."

Well that doesn't sound too promising. Let's have a look at the docs. Ah. Hmmm. You know how it goes with webpack. Why document anything clearly when people could just guess wildly until they near insanity and gibber? Right? It's well established that webpack's attitude to docs has been traditionally akin to Gordon Gekko's view on lunch.

In all fairness, things are beginning to change on that front. In fact the new docs look very promising. But regrettably, the docs on the enhanced-resolve repo are old school. Which is to say: opaque. However, I'm here to tell you that if a sync resolver is your baby then, contrary to appearances, enhanced-resolve has your back.

Sync, for lack of a better word, is good#

Nestled inside enhanced-resolve is [ResolverFactory.js](https://github.com/webpack/enhanced-resolve/blob/3f3f4cd1fcbafa1e98c3c6470fed1277817ed607/lib/ResolverFactory.js) which can be used to make a resolver. However, you can supply it with a million options and that's just like giving someone a gun with a predilection for feet.

What you want is an example of how you could make a sync resolver. Well, surprise surprise it's right in front of your nose. Tucked away in [node.js](https://github.com/webpack/enhanced-resolve/blob/3f3f4cd1fcbafa1e98c3c6470fed1277817ed607/lib/node.js) (I do *not* get the name) is exactly what you're after. It contains a number of factory functions which will construct a ready-made resolver for you; sync or async. Perfect! So here's how I'm rolling:

const node = require("enhanced-resolve/lib/node");
function makeSyncResolver(options) {
return node.create.sync(options.resolve);
}
const resolveSync = makeSyncResolver(loader.options);

The loader options used above you'll be familiar with as the resolve section of your webpack.config.js. You can read more about them here and here.
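To make that concrete, here's a sketch of building the resolver from hand-rolled options rather than from loader.options; the extensions listed are just an illustrative guess at a typical TypeScript setup:

// the object passed in has the same shape as the resolve section
// of a webpack.config.js
const resolveSync = makeSyncResolver({
  resolve: {
    extensions: ['.ts', '.tsx', '.js']
  }
});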

What you're left with at this point is a function; a resolveSync function if you will that takes 3 arguments:

  • context: I don't know what this is. So when using the function I just supply undefined; and that seems to be OK. Weird, right?
  • path: This is the path to your code (I think). So, a valid value to supply - handily lifted from the ts-loader test pack - would be: C:\source\ts-loader\.test\babel-issue92
  • request: The actual module you're interested in; so using the same test the relevant value would be ./submodule/submodule

Put it all together and what have you got?

const resolvedFileName = resolveSync(
undefined,
'C:\\source\\ts-loader\\.test\\babel-issue92',
'./submodule/submodule'
);
// resolvedFileName: C:\source\ts-loader\.test\babel-issue92\submodule\submodule.tsx

Boom.

Understanding Webpack's DefinePlugin (and using with TypeScript)

I've been searching for a way to describe what the DefinePlugin actually does. The docs say*:

Define free variables. Useful for having development builds with debug logging or adding global constants.

* Actually that should read "used to say". I've made some changes to the official docs.... (Surprisingly easy to do that by the way; it's just a wiki you can edit at will.)

I think I would describe it thusly: the DefinePlugin allows you to create global constants which can be configured at compile time. I find this very useful for allowing different behaviour between development builds and release builds. This post will demonstrate usage of this approach, talk about what's actually happening and how to get this working nicely with TypeScript.

If you just want to see this in action then take a look at this repo and keep your eyes open for usage of __VERSION__ and __IN_DEBUG__.

What Globals?#

For our example we want to define 2 global constants; a string called __VERSION__ and a boolean called __IN_DEBUG__. The names are deliberately wacky to draw attention to the fact that these are not your everyday, common-or-garden variables. Them's "special". These constants will be initialised with different values depending on whether we are in a debug build or a production build. Usage of these constants in our code might look like this:

if (__IN_DEBUG__) {
console.log(`This app is version ${ __VERSION__ }`);
}

So, if __IN_DEBUG__ is set to true this code would log the version of the app out to the console.

Configuring our Globals#

To introduce these constants to webpack we're going to add this to our webpack configuration:

var webpack = require('webpack');
// ...
plugins: [
new webpack.DefinePlugin({
__IN_DEBUG__: JSON.stringify(false),
__VERSION__: JSON.stringify('1.0.0.' + Date.now())
}),
// ...
]
// ...

What's going on here? Well, each key of the object literal above represents one of our global constants. When you look at the value, just imagine each outer JSON.stringify( ... ) is not there. It's just noise. Imagine instead that you're seeing this:

__IN_DEBUG__: false,
__VERSION__: '1.0.0.' + Date.now()

A little clearer, right? __IN_DEBUG__ is given the boolean value false and __VERSION__ is given the string value of 1.0.0. plus the ticks off of Date.now(). What's happening here is well explained in Pete Hunt's excellent webpack howto: "definePlugin takes raw strings and inserts them". JSON.stringify facilitates this; it produces a string representation of a value that can be inlined into code. When the inlining takes place the actual output would be something like this:

if (false) { // Because at compile time, __IN_DEBUG__ === false
console.log(`This app is version ${ "1.0.0.1469268116580" }`); // And __VERSION__ === "1.0.0.1469268116580"
}

And if you've got some UglifyJS or similar in the mix then, in the example above, this would actually strip out the statement entirely since it's clearly a NOOP. Yay, dead code removal! If __IN_DEBUG__ was true then (perhaps obviously) this statement would be left in place as it wouldn't be dead code.

TypeScript and Define#

The final piece of the puzzle is making TypeScript happy. It doesn't know anything about our global constants. So we need to tell it:

declare var __IN_DEBUG__: boolean;
declare var __VERSION__: string;

And that's it. Compile time constants are a go!

The Mysterious Case of Webpack, Angular and jQuery

You may know that Angular ships with a cut-down version of jQuery called jQLite. It's still possible to use the full-fat jQuery; to quote the docs:

To use jQuery, simply ensure it is loaded before the angular.js file.

Now the wording rather implies that you're not using any module loader / bundler; rather, that all files are being loaded via script tags, relying on the global variables that result from that. True enough, if you take a look at the Angular source you can see how this works:

// bind to jQuery if present;
var jqName = jq();
jQuery = isUndefined(jqName) ? window.jQuery : // use jQuery (if present)
!jqName ? undefined : // use jqLite
window[jqName]; // use jQuery specified by `ngJq`

Amongst other things it looks for a jQuery variable which has been placed onto the window object. If it is found then jQuery is used; if it is not then it's jqLite all the way.

But wait! I'm using webpack#

Me too! And one of the reasons is that we get to move away from reliance upon the global scope and towards proper modularisation. So how do we get Angular to use jQuery given the code we've seen above? Well, your first thought might be to npm install yourself some jQuery and then make sure you've got something like this in your entry file:

import "jquery"; // This'll fix it... Right?
import * as angular from "angular";

Wrong.

You need the ProvidePlugin#

In your webpack.config.js you need to add the following entry to your plugins:

new webpack.ProvidePlugin({
"window.jQuery": "jquery"
}),

This uses the webpack [ProvidePlugin](https://github.com/webpack/docs/wiki/list-of-plugins#provideplugin) and, at the point of webpackification (© 2016 John Reilly) all references in the code to window.jQuery will be replaced with a reference to the webpack module that contains jQuery. So when you look at the bundled file you'll see that the code that checks the window object for jQuery has become this:

jQuery = isUndefined(jqName) ? __webpack_provided_window_dot_jQuery : // use jQuery (if present)
!jqName ? undefined : // use jqLite
window[jqName]; // use jQuery specified by `ngJq`

That's right; webpack is providing Angular with jQuery whilst still not placing a jQuery variable onto the window. Neat huh?
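As an aside: if your own application code also leans on the jQuery globals ($ or jQuery) rather than importing the module, the same plugin can shim those too. This is a sketch, not something the Angular scenario above requires:

new webpack.ProvidePlugin({
  "window.jQuery": "jquery", // what Angular's check needs
  jQuery: "jquery",          // only if your own code references the jQuery global
  $: "jquery"                // only if your own code references the $ global
}),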

Inlining Angular Templates with WebPack and TypeScript

This technique actually applies to pretty much any web stack where you have to supply templates; it just so happens that I'm using Angular 1.x in this case. Also I have an extra technique which is useful to handle the ng-include scenario.

Preamble#

For some time I've been using webpack to bundle my front end. I write ES6 TypeScript; import statements and all. This is all sewn together using the glorious ts-loader to compile and emit ES6 code which is handed off to the wonderful babel-loader which transpiles it to old-school ES code. All with full source map support. It's wonderful.

However, up until now I've been leaving Angular to perform the relevant http requests at runtime when it needs to pull in templates. That works absolutely fine but my preference is to preload those templates. In fact I've written before about using the gulp angular template cache to achieve just that aim.

So I was wondering; in this modular world what would be the equivalent approach? Sure I could still use the gulp angular template cache approach but I would like something a little more deliberate and a little less magic. Also, I've discovered (to my cost) that when using the existing approach, it's possible to break the existing implementation without realising it; only finding out there's a problem in Production when unexpected http requests start happening. Finding these problems out at compile time rather than runtime is always to be strived for. So how?

raw-loader!#

raw-loader allows you load file content using require statements. This works well with the use case of inlining html. So I drop it into my webpack.config.js like so:

var path = require('path');
module.exports = {
cache: true,
entry: {
main: './src/main.ts',
vendor: [
'babel-polyfill',
'angular',
'angular-animate',
'angular-sanitize',
'angular-ui-bootstrap',
'angular-ui-router'
]
},
output: {
path: path.resolve(__dirname, './dist/scripts'),
filename: '[name].js',
chunkFilename: '[chunkhash].js'
},
module: {
loaders: [{
test: /\.ts(x?)$/,
exclude: /node_modules/,
loader: 'babel-loader?presets[]=es2015!ts-loader'
}, {
test: /\.js$/,
exclude: /node_modules/,
loader: 'babel',
query: {
presets: ['es2015']
}
}, { // THIS IS THE MAGIC!
test: /\.html$/,
exclude: /node_modules/,
loader: 'raw'
}] // THAT WAS THE MAGIC!
},
plugins: [
// ....
],
resolve: {
extensions: ['', '.ts', '.tsx', '.js']
}
};

With this in place, if someone requires a file with the html suffix then raw-loader comes in. So now we can swap this:

$stateProvider
.state('state1', {
url: "/state1",
templateUrl: "partials/state1.html"
})

For this:

$stateProvider
.state('state1', {
url: "/state1",
template: require("./partials/state1.html")
})

Now initially TypeScript is going to complain about your require statement. That's fair; outside of node-land it doesn't know what require is. No bother, you just need to drop in a simple one-line definition file to sort this out; let me present webpack-require.d.ts:

declare var require: (filename: string) => any;

You've now inlined your template. And for bonus points, if you were to make a mistake in your path then webpack would shout at you at compile time; which is a good, good thing.

ng-include#

The one use case that this doesn't cover is where your templates import other templates through use of the ng-include directive. They will still trigger http requests as the templates are served. The simple way to prevent that is by priming the angular [$templateCache](https://docs.angularjs.org/api/ng/service/$templateCache) like so:

app.run(["$templateCache",
($templateCache: ng.ITemplateCacheService) => {
$templateCache.put("justSome.html", require("./justSome.html"));
// Other templates go here...
}]);

Now when the app spins up it already has everything it needs pre-cached.

Visual Studio, tsconfig.json and external TypeScript compilation

TypeScript first gained support for tsconfig.json back with the 1.5 release. However, to my lasting regret and surprise Visual Studio will not be gaining meaningful support for it until TypeScript 1.8 ships. However, if you want it now, it's already available to use in beta.

I've already leapt aboard. Whilst there's a number of bugs in the beta it's still totally usable. So use it.

External TypeScript Compilation and the VS build#

Whilst tsconfig.json is useful and super cool it has limitations. It allows you to deactivate compilation upon file saving using compileOnSave. What it doesn't allow is deactivation of the TypeScript compilation that happens as part of a Visual Studio build. That may not matter for the vanilla workflow of just dropping TypeScript files in a Visual Studio web project and having VS invoke the TypeScript compilation. However it comes to matter when your workflow deviates from the norm, as mine does. Using external compilation of TypeScript within Visual Studio is a little tricky. My own use case is somewhat atypical but perhaps not uncommon.
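For reference, the bit that tsconfig.json can control looks like this; a minimal sketch with the rest of the compiler options omitted:

{
  "compileOnSave": false,
  "compilerOptions": {
    // your existing options stay here
  }
}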

I'm working on a project which has been built using TypeScript since TS 0.9. Not surprisingly, this started off using the default Visual Studio / TypeScript workflow. Active development on the project ceased around 9 months ago. Now it's starting up again. It's a reasonable sized web app and the existing functionality is, in the main, fine. But the users want to add some new screens.

Like any developer, I want to build with the latest and greatest. In my case, this means I want to write modular ES6 using TypeScript. With this approach my code can be leaner and I have less script ordering drama in my life. (Yay import statements!) This can be done by bringing together webpack, TypeScript (ts-loader) and Babel (babel-loader). There's an example of this approach here. Given the size of the existing codebase I'd rather leave the legacy TypeScript as is and isolate my new approach to the screens I'm going to build. Obviously I'd like to have a common build process for all the codebase at some point but I've got a deadline to meet and so a half-old / half-new approach is called for (at least for the time being).

Goodbye TypeScript Compilation in VS#

Writing modular ES6 TypeScript which is fully transpiled to old-school JS is not possible using the Visual Studio tooling at present. For what it's worth I think that SystemJS compilation may make this more possible in the future but I don't really know enough about it to be sure. That's why I'm bringing webpack / Babel into the mix right now. I don't want Visual Studio to do anything for the ES6 code; I don't want it to compile. I want to deactivate my TypeScript compilation for the ES6 code. I can't do this from the tsconfig.json so I'm in a bit of a hole. What to do?

Well, as of (I think) TypeScript 1.7 it's possible to deactivate TypeScript compilation in Visual Studio. To quote:

there is an easier way to disable TypeScriptCompile:

Just add <TypeScriptCompileBlocked>true</TypeScriptCompileBlocked> to the .csproj, e.g. in the first <PropertyGroup>.

Awesomeness!

But wait, this means that the legacy TypeScript isn't being compiled any longer. Bear in mind, I'm totally happy with the existing / legacy TypeScript compilation. Nooooooooooooooo!!!!!!!!!!!!!!!

Hello TypeScript Compilation outside VS#

Have no fear, I gotcha. What we're going to do is ensure that Visual Studio triggers 2 external TypeScript builds as part of its own build process:

  • The modular ES6 TypeScript (new)
  • The legacy TypeScript (old)

How do we do this? Through the magic of build targets. We need to add this to our .csproj: (I add it near the end; I'm not sure if location matters though)

<PropertyGroup>
<CompileDependsOn>
$(CompileDependsOn);
WebClientBuild;
</CompileDependsOn>
<CleanDependsOn>
$(CleanDependsOn);
WebClientClean
</CleanDependsOn>
<CopyAllFilesToSingleFolderForPackageDependsOn>
CollectGulpOutput;
CollectLegacyTypeScriptOutput;
$(CopyAllFilesToSingleFolderForPackageDependsOn);
</CopyAllFilesToSingleFolderForPackageDependsOn>
<CopyAllFilesToSingleFolderForMsdeployDependsOn>
CollectGulpOutput;
CollectLegacyTypeScriptOutput;
$(CopyAllFilesToSingleFolderForPackageDependsOn);
</CopyAllFilesToSingleFolderForMsdeployDependsOn>
</PropertyGroup>
<Target Name="WebClientBuild">
<Exec Command="npm install" />
<Exec Command="npm run build-legacy-typescript" />
<Exec Command="npm run build -- --mode $(ConfigurationName)" />
</Target>
<Target Name="WebClientClean">
<Exec Command="npm run clean" />
</Target>
<Target Name="CollectGulpOutput">
<ItemGroup>
<_CustomFiles Include="dist\**\*" />
<FilesForPackagingFromProject Include="%(_CustomFiles.Identity)">
<DestinationRelativePath>dist\%(RecursiveDir)%(Filename)%(Extension)</DestinationRelativePath>
</FilesForPackagingFromProject>
</ItemGroup>
<Message Text="CollectGulpOutput list: %(_CustomFiles.Identity)" />
</Target>
<Target Name="CollectLegacyTypeScriptOutput">
<ItemGroup>
<_CustomFiles Include="Scripts\**\*.js" />
<FilesForPackagingFromProject Include="%(_CustomFiles.Identity)">
<DestinationRelativePath>Scripts\%(RecursiveDir)%(Filename)%(Extension)</DestinationRelativePath>
</FilesForPackagingFromProject>
</ItemGroup>
<Message Text="CollectLegacyTypeScriptOutput list: %(_CustomFiles.Identity)" />
</Target>

There's a few things going on here; let's take them one by one.

The WebClientBuild Target#

This target triggers our external builds. One by one it runs the following commands:

  1. npm install: installs the npm packages.
  2. npm run build-legacy-typescript: runs the "build-legacy-typescript" script in our package.json.
  3. npm run build -- --mode $(ConfigurationName): runs the "build" script in our package.json and passes through a mode parameter of either "Debug" or "Release" from MSBuild - indicating whether we're creating a debug or a release build.

As you've no doubt gathered, I'm following the convention of using the scripts element of my package.json as a repository for the various build tasks I might have for a web project. It looks like this:

{
// ...
"scripts": {
"test": "karma start --reporters mocha,junit --single-run --browsers PhantomJS",
"build-legacy-typescript": "tsc -v&&tsc --project Scripts",
"clean": "gulp delete-dist-contents",
"watch": "gulp watch",
"build": "gulp build"
},
// ...
}

As you can see, "build-legacy-typescript" invokes tsc (which is registered as a devDependency) to print out the version of the compiler. Then it invokes tsc again using the project flag directly on the Scripts directory. This is where the legacy TypeScript and its associated tsconfig.json resides. Et voilá, the old / existing TypeScript is compiled just as it was previously by VS itself.

Next, the "build" invokes a gulp task called, descriptively, "build". This task caters for our brand new codebase of modular ES6 TypeScript. When run, this task will invoke webpack, copy static files, build less etc. Quick digression: you can see there's a "watch" script that does the same thing on a file-watching basis; I use that during development.

The WebClientClean Target#

The task that runs to clean up artefacts created by WebClientBuild.

The CollectLegacyTypeScriptOutput and CollectGulpOutput Targets#

Since we're compiling our TypeScript outside of VS we need to tell MSBuild / MSDeploy about the externally compiled assets in order that they are included in the publish pipeline. Here I'm standing on the shoulders of Steve Cadwallader's excellent post. Thanks Steve!

CollectLegacyTypeScriptOutput and CollectGulpOutput respectively include all the built files contained in the "Scripts" and "dist" folders when a publish takes place. You don't need this when you're building on your own machine, but if you're looking to publish (either from your machine or from TFS) then you will need exactly this. Believe me that last sentence was typed with a memory of great pain and frustration.

So in the end, as far as TypeScript is concerned, I'm using Visual Studio solely as an editor. It's the hooks in the .csproj that ensure that compilation happens. It seems a little quirky that we still need to have the original TypeScript targets in the .csproj file as well; but it works. That's all that matters.

ES6 + TypeScript + Babel + React + Flux + Karma: The Secret Recipe

I wrote a while ago about how I was using some different tools in a current project:

  • React with JSX
  • Flux
  • ES6 with Babel
  • Karma for unit testing

I have fully come to love and appreciate all of the above. I really like working with them. However. There was still an ache in my soul and a thorn in my side. Whilst I love the syntax of ES6 and even though I've come to appreciate the clarity of JSX, I have been missing something. Perhaps you can guess? It's static typing.

It's actually been really good to have the chance to work without it because it's made me realise what a productivity boost having static typing actually is. The number of silly, time-burning mistakes that a compiler could have caught for me.... Sigh.

But the pain is over. The dark days are gone. It's possible to have strong typing, courtesy of TypeScript, plugged into this workflow. It's yours for the taking. Take it. Take it now!

What a Guy Wants#

I decided a couple of months ago what I wanted to have in my setup:

  1. I want to be able to write React / JSX in TypeScript. Naturally I couldn't achieve that by myself but handily the TypeScript team decided to add support for JSX with TypeScript 1.6. Ooh yeah.
  2. I wanted to be able to write ES6. When I realised the approach for writing ES6 and having the transpilation handled by TypeScript wasn't clear I had another idea. I thought "what if I write ES6 and hand off the transpilation to Babel?" i.e. Use TypeScript for type checking, not for transpilation. I realised that James Brantly had my back here already. Enter Webpack and ts-loader.
  3. Debugging. Being able to debug my code is non-negotiable for me. If I can't debug it I'm less productive. (I'm also bitter and twisted inside.) I should say that I wanted to be able to debug my original source code. Thanks to the magic of sourcemaps, that mad thing is possible.
  4. Karma for unit testing. I've become accustomed to writing my tests in ES6 and running them on a continual basis with Karma. This allows for a rather good debugging story as well. I didn't want to lose this when I moved to TypeScript. I didn't.

So I've talked about what I want and I've alluded to some of the solutions that there are. The question now is how to bring them all together. This post is, for the most part, going to be about correctly orchestrating a number of gulp tasks to achieve the goals listed above. If you're after the Blue Peter "here's one I made earlier" moment then take a look at the es6-babel-react-flux-karma repo in the Microsoft/TypeScriptSamples repo on Github.

gulpfile.js#

/* eslint-disable no-var, strict, prefer-arrow-callback */
'use strict';
var gulp = require('gulp');
var gutil = require('gulp-util');
var connect = require('gulp-connect');
var eslint = require('gulp-eslint');
var webpack = require('./gulp/webpack');
var staticFiles = require('./gulp/staticFiles');
var tests = require('./gulp/tests');
var clean = require('./gulp/clean');
var inject = require('./gulp/inject');
var lintSrcs = ['./gulp/**/*.js'];
gulp.task('delete-dist', function (done) {
clean.run(done);
});
gulp.task('build-process.env.NODE_ENV', function () {
process.env.NODE_ENV = 'production';
});
gulp.task('build-js', ['delete-dist', 'build-process.env.NODE_ENV'], function(done) {
webpack.build().then(function() { done(); });
});
gulp.task('build-other', ['delete-dist', 'build-process.env.NODE_ENV'], function() {
staticFiles.build();
});
gulp.task('build', ['build-js', 'build-other', 'lint'], function () {
inject.build();
});
gulp.task('lint', function () {
return gulp.src(lintSrcs)
.pipe(eslint())
.pipe(eslint.format());
});
gulp.task('watch', ['delete-dist'], function() {
process.env.NODE_ENV = 'development';
Promise.all([
webpack.watch()//,
//less.watch()
]).then(function() {
gutil.log('Now that initial assets (js and css) are generated inject will start...');
inject.watch(postInjectCb);
}).catch(function(error) {
gutil.log('Problem generating initial assets (js and css)', error);
});
gulp.watch(lintSrcs, ['lint']);
staticFiles.watch();
tests.watch();
});
gulp.task('watch-and-serve', ['watch'], function() {
postInjectCb = stopAndStartServer;
});
var postInjectCb = null;
var serverStarted = false;
function stopAndStartServer() {
if (serverStarted) {
gutil.log('Stopping server');
connect.serverClose();
serverStarted = false;
}
startServer();
}
function startServer() {
gutil.log('Starting server');
connect.server({
root: './dist',
port: 8080
});
serverStarted = true;
}

Let's start picking this apart; what do we actually have here? Well, we have 2 gulp tasks that I want you to notice:

build

This is likely the task you would use when deploying. It takes all of your source code, builds it, provides cache-busting filenames (eg main.dd2fa20cd9eac9d1fb2f.js), injects your shell SPA page with references to the files and deploys everything to the ./dist/ directory. So that's TypeScript, static assets like images and CSS all made ready for Production.

The build task also implements this advice:

When deploying your app, set the NODE_ENV environment variable to production to use the production build of React which does not include the development warnings and runs significantly faster.

watch-and-serve

This task represents "development mode" or "debug mode". It's what you'll likely be running as you develop your app. It does the same as the build task but with some important distinctions.

  • As well as building your source it also runs your tests using Karma
  • This task is not triggered on a once-only basis, rather your files are watched and each tweak of a file will result in a new build and a fresh run of your tests. Nice eh?
  • It spins up a simple web server and serves up the contents of ./dist (i.e. your built code) in order that you can easily test out your app.
  • In addition, whilst it builds your source it does not minify your code and it emits sourcemaps. For why? For debugging! You can go to http://localhost:8080/ in your browser of choice, fire up the dev tools and you're off to the races; debugging like gangbusters. It also doesn't bother to provide cache-busting filenames as Chrome dev tools are smart enough to not cache localhost.
  • Oh and Karma.... If you've got problems with a failing test then head to http://localhost:9876/ and you can debug the tests in your dev tools.
  • Finally, it runs ESLint in the console. Not all of my files are TypeScript; essentially the build process (aka "gulp-y") files are all vanilla JS. So they're easily breakable. ESLint is there to provide a little reassurance on that front.

Now let's dig into each of these in a little more detail

WebPack#

Let's take a look at what's happening under the covers of webpack.build() and webpack.watch().

WebPack with ts-loader and babel-loader is what we're using to compile our ES6 TypeScript. ts-loader uses the TypeScript compiler to, um, compile TypeScript and emit ES6 code. This is then passed on to the babel-loader which transpiles it from ES6 down to ES-old-school. It all gets brought together in 2 files; main.js which contains the compiled result of the code written by us and vendor.js which contains the compiled result of 3rd party / vendor files. The reason for this separation is that vendor files are likely to change fairly rarely whilst our own code will constantly be changing. This separation allows for quicker compile times upon file changes as, for the most part, the vendor files will not need to be included in this process.

Our gulpfile.js above uses the following task:

'use strict';
var gulp = require('gulp');
var gutil = require('gulp-util');
var webpack = require('webpack');
var WebpackNotifierPlugin = require('webpack-notifier');
var webpackConfig = require('../webpack.config.js');
function buildProduction(done) {
// modify some webpack config options
var myProdConfig = Object.create(webpackConfig);
myProdConfig.output.filename = '[name].[hash].js';
myProdConfig.plugins = myProdConfig.plugins.concat(
// make the vendor.js file with cachebusting filename
new webpack.optimize.CommonsChunkPlugin({ name: 'vendor', filename: 'vendor.[hash].js' }),
new webpack.optimize.DedupePlugin(),
new webpack.optimize.UglifyJsPlugin()
);
// run webpack
webpack(myProdConfig, function(err, stats) {
if(err) { throw new gutil.PluginError('webpack:build', err); }
gutil.log('[webpack:build]', stats.toString({
colors: true
}));
if (done) { done(); }
});
}
function createDevCompiler() {
// show me some sourcemap love people
var myDevConfig = Object.create(webpackConfig);
myDevConfig.devtool = 'inline-source-map';
myDevConfig.debug = true;
myDevConfig.plugins = myDevConfig.plugins.concat(
// Make the vendor.js file
new webpack.optimize.CommonsChunkPlugin({ name: 'vendor', filename: 'vendor.js' }),
new WebpackNotifierPlugin({ title: 'Webpack build', excludeWarnings: true })
);
// create a single instance of the compiler to allow caching
return webpack(myDevConfig);
}
function buildDevelopment(done, devCompiler) {
// run webpack
devCompiler.run(function(err, stats) {
if(err) { throw new gutil.PluginError('webpack:build-dev', err); }
gutil.log('[webpack:build-dev]', stats.toString({
chunks: false, // dial down the output from webpack (it can be noisy)
colors: true
}));
if (done) { done(); }
});
}
function bundle(options) {
var devCompiler;
function build(done) {
if (options.shouldWatch) {
buildDevelopment(done, devCompiler);
} else {
buildProduction(done);
}
}
if (options.shouldWatch) {
devCompiler = createDevCompiler();
gulp.watch('src/**/*', function() { build(); });
}
return new Promise(function(resolve, reject) {
build(function (err) {
if (err) {
reject(err);
} else {
resolve('webpack built');
}
});
});
}
module.exports = {
build: function() { return bundle({ shouldWatch: false }); },
watch: function() { return bundle({ shouldWatch: true }); }
};

Hopefully this is fairly self-explanatory; essentially buildDevelopment performs the development build (providing sourcemap support) and buildProduction builds for Production (providing minification support). Both are driven by this webpack.config.js:

/* eslint-disable no-var, strict, prefer-arrow-callback */
'use strict';
var path = require('path');
module.exports = {
cache: true,
entry: {
// The entry point of our application; the script that imports all other scripts in our SPA
main: './src/main.tsx',
// The packages that are to be included in vendor.js
vendor: [
'babel-polyfill',
'events',
'flux',
'react'
]
},
// Where the output of our compilation ends up
output: {
path: path.resolve(__dirname, './dist/scripts'),
filename: '[name].js',
chunkFilename: '[chunkhash].js'
},
module: {
loaders: [{
// The loader that handles ts and tsx files. These are compiled
// with the ts-loader and the output is then passed through to the
// babel-loader. The babel-loader uses the es2015 and react presets
// in order that jsx and es6 are processed.
test: /\.ts(x?)$/,
exclude: /node_modules/,
loader: 'babel-loader?presets[]=es2015&presets[]=react!ts-loader'
}, {
// The loader that handles any js files presented alone.
// It passes these to the babel-loader which (again) uses the es2015
// and react presets.
test: /\.js$/,
exclude: /node_modules/,
loader: 'babel',
query: {
presets: ['es2015', 'react']
}
}]
},
plugins: [
],
resolve: {
// Files with the following extensions are fair game for webpack to process
extensions: ['', '.webpack.js', '.web.js', '.ts', '.tsx', '.js']
},
};

Inject#

Your compiled output needs to be referenced from some kind of HTML page. So we've got this:

<!doctype html>
<html lang="en">
<head>
<meta charSet="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=1">
<title>ES6 + Babel + React + Flux + Karma: The Secret Recipe</title>
<!-- inject:css -->
<!-- endinject -->
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.5/css/bootstrap.min.css">
</head>
<body>
<div id="content"></div>
<!-- inject:js -->
<!-- endinject -->
</body>
</html>

Which is no more than a boilerplate HTML page with a couple of key features:

  • a single <div /> element in the <body /> which is where our React app is going to be rendered.
  • <!-- inject:css --> and <!-- inject:js --> placeholders where css and js are going to be injected by gulp-inject.
  • a single <link /> to the Bootstrap CDN. This sample app doesn't actually serve up any css generated as part of the project. It could but it doesn't. When it comes to injection time no css will actually be injected. This has been left in place as, more typically, a project would have some styling served up.

This is fed into our inject task in inject.build() and inject.watch(). They take css and javascript and, using our shell template, create a new page which has the css and javascript dropped into their respective placeholders:

'use strict';
var gulp = require('gulp');
var inject = require('gulp-inject');
var glob = require('glob');

function injectIndex(options) {
  var postInjectCb = options.postInjectCb;
  var postInjectCbTriggerId = null;

  function run() {
    var target = gulp.src('./src/index.html');
    var sources = gulp.src([
      //'./dist/styles/main*.css',
      './dist/scripts/vendor*.js',
      './dist/scripts/main*.js'
    ], { read: false });

    return target
      .on('end', function() { // invoke postInjectCb after 1s
        if (postInjectCbTriggerId || !postInjectCb) { return; }
        postInjectCbTriggerId = setTimeout(function() {
          postInjectCb();
          postInjectCbTriggerId = null;
        }, 1000);
      })
      .pipe(inject(sources, { ignorePath: '/dist/', addRootSlash: false, removeTags: true }))
      .pipe(gulp.dest('./dist'));
  }

  var jsCssGlob = 'dist/**/*.{js,css}';

  function checkForInitialFilesThenRun() {
    glob(jsCssGlob, function (er, files) {
      var filesWeNeed = ['dist/scripts/main', 'dist/scripts/vendor'/*, 'dist/styles/main'*/];

      function fileIsPresent(fileWeNeed) {
        return files.some(function(file) {
          return file.indexOf(fileWeNeed) !== -1;
        });
      }

      if (filesWeNeed.every(fileIsPresent)) {
        run('initial build');
      } else {
        checkForInitialFilesThenRun();
      }
    });
  }

  checkForInitialFilesThenRun();

  if (options.shouldWatch) {
    gulp.watch(jsCssGlob, function(evt) {
      if (evt.path && evt.type === 'changed') {
        run(evt.path);
      }
    });
  }
}

module.exports = {
  build: function() { return injectIndex({ shouldWatch: false }); },
  watch: function(postInjectCb) { return injectIndex({ shouldWatch: true, postInjectCb: postInjectCb }); }
};

This also triggers the server to serve up the new content.
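
To give a feel for how that might hang together, here's a hypothetical gulpfile wiring using a BrowserSync-style server. The ./gulp/inject path, the task name and the choice of server are assumptions for illustration rather than the project's actual setup:

```js
'use strict';
var gulp = require('gulp');
var browserSync = require('browser-sync').create();
var inject = require('./gulp/inject'); // hypothetical path to the module above

gulp.task('watch-and-serve', function() {
  // Serve the dist folder and reload the browser whenever index.html has had
  // fresh script/style references injected into it.
  browserSync.init({ server: { baseDir: './dist' } });
  inject.watch(function postInjectCb() {
    browserSync.reload();
  });
});
```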

Static Files#

Your app will likely rely on a number of static assets: images, fonts and whatnot. This script picks up the static assets you've defined and places them in the dist folder, ready for use:

'use strict';
var gulp = require('gulp');
var cache = require('gulp-cached');

var targets = [
  // In my own example I don't use any of the targets below, they
  // are included to give you more of a feel of how you might use this
  { description: 'FONTS', src: './fonts/*', dest: './dist/fonts' },
  { description: 'STYLES', src: './styles/*', dest: './dist/styles' },
  { description: 'FAVICON', src: './favicon.ico', dest: './dist' },
  { description: 'IMAGES', src: './images/*', dest: './dist/images' }
];

function copy(options) {
  // Copy files from their source to their destination
  function run(target) {
    gulp.src(target.src)
      .pipe(cache(target.description))
      .pipe(gulp.dest(target.dest));
  }
  function watch(target) {
    gulp.watch(target.src, function() { run(target); });
  }

  targets.forEach(run);
  if (options.shouldWatch) {
    targets.forEach(watch);
  }
}

module.exports = {
  build: function() { return copy({ shouldWatch: false }); },
  watch: function() { return copy({ shouldWatch: true }); }
};
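
Should you need to copy other asset types, extending this ought to be a case of adding another entry to the targets array; for instance (a made-up example, the paths are hypothetical):

```js
// Hypothetical extra target: copy JSON locale files alongside the other assets.
{ description: 'LOCALES', src: './locales/*.json', dest: './dist/locales' }
```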

Karma#

Finally, we're ready to get our tests set up to run continually with Karma. tests.watch() triggers the following task:

'use strict';
var Server = require('karma').Server;
var path = require('path');
var gutil = require('gulp-util');

module.exports = {
  watch: function() {
    // Documentation: https://karma-runner.github.io/0.13/dev/public-api.html
    var karmaConfig = {
      configFile: path.join(__dirname, '../karma.conf.js'),
      singleRun: false,
      plugins: ['karma-webpack', 'karma-jasmine', 'karma-mocha-reporter', 'karma-sourcemap-loader', 'karma-phantomjs-launcher', 'karma-phantomjs-shim'], // karma-phantomjs-shim only in place until PhantomJS hits 2.0 and has function.bind
      reporters: ['mocha']
    };

    new Server(karmaConfig, karmaCompleted).start();

    function karmaCompleted(exitCode) {
      gutil.log('Karma has exited with:', exitCode);
      process.exit(exitCode);
    }
  }
};

When running in watch mode it's possible to debug the tests by going to http://localhost:9876/. It's also possible to run the tests standalone with a simple npm run test. Running them like this also outputs the results to an XML file in JUnit format; this can be useful for integrating with CI solutions that don't natively pick up test results.
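
The standalone flavour isn't reproduced above, but presumably it amounts to little more than the watch task with singleRun switched on and the JUnit reporter added. A sketch along those lines (assuming karma-junit-reporter is installed):

```js
// Hypothetical single-run variant, roughly what `npm run test` might invoke.
// The junitReporter settings themselves live in karma.conf.js (shown below).
'use strict';
var Server = require('karma').Server;
var path = require('path');

module.exports = {
  build: function() {
    var karmaConfig = {
      configFile: path.join(__dirname, '../karma.conf.js'),
      singleRun: true,
      reporters: ['mocha', 'junit'],
      plugins: ['karma-webpack', 'karma-jasmine', 'karma-mocha-reporter',
        'karma-junit-reporter', 'karma-sourcemap-loader',
        'karma-phantomjs-launcher', 'karma-phantomjs-shim']
    };

    new Server(karmaConfig, function(exitCode) {
      process.exit(exitCode);
    }).start();
  }
};
```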

Whichever approach we use for running tests, we use the following karma.conf.js file to configure Karma:

/* eslint-disable no-var, strict */
'use strict';
var webpackConfig = require('./webpack.config.js');

module.exports = function(config) {
  // Documentation: https://karma-runner.github.io/0.13/config/configuration-file.html
  config.set({
    browsers: [ 'PhantomJS' ],
    files: [
      'test/import-babel-polyfill.js', // This ensures we have the es6 shims in place from babel
      'test/**/*.tests.ts',
      'test/**/*.tests.tsx'
    ],
    port: 9876,
    frameworks: [ 'jasmine', 'phantomjs-shim' ],
    logLevel: config.LOG_INFO, //config.LOG_DEBUG
    preprocessors: {
      'test/import-babel-polyfill.js': [ 'webpack', 'sourcemap' ],
      'src/**/*.{ts,tsx}': [ 'webpack', 'sourcemap' ],
      'test/**/*.tests.{ts,tsx}': [ 'webpack', 'sourcemap' ]
    },
    webpack: {
      devtool: 'eval-source-map', //'inline-source-map', - inline-source-map doesn't work at present
      debug: true,
      module: webpackConfig.module,
      resolve: webpackConfig.resolve
    },
    webpackMiddleware: {
      quiet: true,
      stats: {
        colors: true
      }
    },
    // reporter options
    mochaReporter: {
      colors: {
        success: 'bgGreen',
        info: 'cyan',
        warning: 'bgBlue',
        error: 'bgRed'
      }
    },
    junitReporter: {
      outputDir: 'test-results', // results will be saved as $outputDir/$browserName.xml
      outputFile: undefined, // if included, results will be saved as $outputDir/$browserName/$outputFile
      suite: ''
    }
  });
};

As you can see, we're reusing the webpack configuration from earlier to drive much of how the transpilation takes place.
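
One small file worth a mention is test/import-babel-polyfill.js, listed first in the files array. It isn't reproduced here, but presumably it's no more than a one-liner along these lines (assuming the Babel 6 era babel-polyfill package):

```js
// Hypothetical content of test/import-babel-polyfill.js (not reproduced in the
// original post): pulls in babel-polyfill so PhantomJS gets the ES6 shims.
import 'babel-polyfill';
```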

And that's it; we have a workflow for developing in TypeScript using React, with tests running in an automated fashion. I appreciate this has been a rather long blog post, but I hope I've clarified how it all plugs together and works. Do leave a comment if you think I've missed something.

Babel 5 -> Babel 6#

This post has actually been sat waiting to be published for some time. I'd got this solution up and running with Babel 5. Then they shipped Babel 6 and (as is the way with "breaking changes") broke sourcemap support and thus torpedoed this workflow. Happily that's now been resolved. But if you should experience any wonkiness - it's worth checking that you're using the latest and greatest of Babel 6.