Sunday, 24 December 2017

ts-loader 2017 retrospective

2017 is drawing to a close, and it's been a big, big year in webpack-land. It's been a big year for ts-loader too. At the start of the year v1.3.3 was the latest version available, officially supporting webpack 1. (Old school!) We end the year with ts-loader sitting pretty at v3.2.0 and supporting webpack 2 and 3.

Many releases were shipped and that was down to a whole bunch of folk. People helped out with bug fixes, features, advice and docs improvements. All of these help. ts-loader wouldn't be where it is without you so thanks to everyone that helped out - you rock!

I'm really grateful to all of you. Thanks so much! (Apologies if I've missed anyone out - I know there are more of you still.)

fork-ts-checker-webpack-plugin build speed improvements

Alongside direct contributions to ts-loader, other projects improved the experience of using it. Piotr Oleś dropped his fork-ts-checker-webpack-plugin this year, which nicely increased build speed when used with ts-loader.

That opened up the possibility of adding HappyPack support. I had the good fortune to work with webpack's Tobias Koppers and ExtraHop's Alex Birmingham on improving TypeScript build speed further.

So what does the future hold?

ts-loader 4.0 (Live webpack or Die Hard)

The web marches on and webpack gallops alongside. Here's what's in the pipeline for ts-loader in 2018:

Start using the new watch API

TypeScript is adding a new watch API. We have a PR from the amazing Sheetal Nandi which adds support for it to ts-loader. Given that it's quite a big PR, we want to merge it before anything else lands. The watch API is still being finalised, but once it ships in TypeScript we'll look to merge the PR and release a new version of ts-loader.

Drop custom module resolution

Historically ts-loader has had its own module resolution mechanism in place. We're going to look to move to using the TypeScript mechanism instead. The old module resolution will be deprecated but will remain available behind a flag for a time. In future we'll look to drop the old mechanism entirely.

Drop support for TypeScript 2.3 and below

The codebase can be made simpler if we drop support for older versions of TypeScript so that's what we plan to do with our next breaking changes release.

webpack v4 is in alpha now

If any changes need to happen to ts-loader to support webpack 4 then they will be made. Personally I'm planning to help out with fork-ts-checker-webpack-plugin as there will likely be some changes required there.

contextAsConfigBasePath to be replaced with a context option

The option that landed in the last month doesn't quite achieve the aims of the original PR's author Christian Tinauer. Consequently it's going to be replaced with a new option. This is queued up and ready to go here.

reportFiles option to be added

Michel Rasschaert is presently working on adding a reportFiles option to ts-loader. You can see the PR in progress here.
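
The intention is to let you restrict which files ts-loader reports errors for. If it ships in the shape the PR suggests, usage should look something like the sketch below (the option name comes from the PR; the glob value is purely illustrative):


module.exports = {
    // other config...
    module: {
        rules: [
            {
                test: /\.tsx?$/,
                loader: 'ts-loader',
                options: {
                    // only report errors for files matching these globs (illustrative value)
                    reportFiles: ['src/**/*.{ts,tsx}']
                }
            }
        ]
    }
};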

Merry Christmas!

You can expect to see the first releases of ts-loader 4.0 in 2018. In the meantime, I'd like to wish you Merry Christmas and a Happy New Year! And once more, thanks and thanks again to all you generous people who help build ts-loader. You're wonderful and so I'm glad you do what you do... joyeux Noel!

Sunday, 19 November 2017

The TypeScript webpack PWA

So, there you sit, conflicted. You've got a lovely build setup; it's a thing of beauty. Precious, polished like a diamond, sharpened like a circular saw. There at the core of your carefully crafted setup sits webpack. Heaving, mysterious... powerful.

There's more. Not only are you sold on webpack, you're all in on TypeScript too. But now you've heard tell of "Progressive Web Applications" and "Service Workers".... And you want to be dealt in. You want to build web apps that work offline. It can't work, can it? Your build setup's going to be like the creature in the episode where they've taken one too many jumps and it's gone into the foetal position.

So this is the plan kids. Let's take a simple TypeScript, webpack setup and make it a PWA. Like Victoria Wood said...

Let's Do It Tonight

How to begin? Well first comes the plagiarism; here's a simple TypeScript webpack setup. Rob it. Stick a gun to its head and order it onto your hard drive. yarn install to pick up your dependencies and then yarn start to see what you've got. Something like this:

Beautiful right? And if we yarn build we end up with a simple output:

To test out what we've built we want to use a simple web server to serve up the dist folder. I've got the npm package http-server installed globally for just such an eventuality. So let's http-server ./dist and I'm once again looking at our simple app; it looks exactly the same as when I yarn start. Smashing. What would we see if we were offline? Well, thanks to the magic of Chrome DevTools we can find out. Go offline and refresh the browser...

Not very user friendly. Once we're done, we should be able to refresh and still see our app.

Work(box) It

Workbox is a project that makes the setting up of Service Workers (aka the magic that powers PWAs) easier. It supports webpack use cases through the workbox-webpack-plugin; so let's give it a whirl. Incidentally, there's a cracking example on the Workbox site.

yarn add workbox-webpack-plugin --dev adds the plugin to our project. To make use of it, punt your way over to the webpack.production.config.js and add an entry for the plugin. We also need to set the hash parameter of the html-webpack-plugin to be false; if it's true it'll cause problems for the ServiceWorker.


const path = require('path'); // used below to resolve where the ServiceWorker is written
const WorkboxPlugin = require('workbox-webpack-plugin');

//...

module.exports = {

    //...

    plugins: [

        //...

        new HtmlWebpackPlugin({
            hash: false,
            inject: true,
            template: 'src/index.html',
            minify: {
                removeComments: true,
                collapseWhitespace: true,
                removeRedundantAttributes: true,
                useShortDoctype: true,
                removeEmptyAttributes: true,
                removeStyleLinkTypeAttributes: true,
                keepClosingSlash: true,
                minifyJS: true,
                minifyCSS: true,
                minifyURLs: true,
            },
        }),

        new WorkboxPlugin({
            // we want our service worker to cache the dist directory
            globDirectory: 'dist',
            // these are the sorts of files we want to cache
            globPatterns: ['**/*.{html,js,css,png,svg,jpg,gif,json}'],
            // this is where we want our ServiceWorker to be created
            swDest: path.resolve('dist', 'sw.js'),
            // these options encourage the ServiceWorkers to get in there fast 
            // and not allow any straggling "old" SWs to hang around
            clientsClaim: true,
            skipWaiting: true,
        }),
    ]

    //...
};

With this in place, yarn build will generate a ServiceWorker. Now to alter our code to register it. Open up index.tsx and add this to the end of the file:


if ('serviceWorker' in navigator) {
  window.addEventListener('load', () => {
    navigator.serviceWorker.register('/sw.js').then(registration => {
      // tslint:disable:no-console
      console.log('SW registered: ', registration);
    }).catch(registrationError => {
      console.log('SW registration failed: ', registrationError);
    });
  });
}

Put it together and...

What Have We Got?

Let's yarn build again.

Oooohh look! A service worker is with us. Does it work? Let's find out... http-server ./dist, browse to http://localhost:8080 and let's have a look at the console.

Looks very exciting. So now the test; let's go offline and refresh:

You are looking at the 200s of success. You're now running with webpack and TypeScript and you have built a Progressive Web Application. Feel good about life.

Friday, 20 October 2017

TypeScript Definitions, webpack and Module Types

A funny thing happened on the way to the registry the other day. Something changed in an npm package I was using and confusion arose. You can read my unfiltered confusion here but here's the slightly clearer explanation.

The TL;DR

When modules are imported, your loader will decide which module format it wants to use. CommonJS / AMD etc. The loader decides. It's important that the export is of the same "shape" regardless of the module format. For 2 reasons:

  1. You want to be able to reliably use the module regardless of the choice that your loader has made for which export to use.
  2. Because when it comes to writing type definition files for modules, there is support for a single external definition. Not one for each module format.

The DR

Once upon a time we decided to use big.js in our project. It's popular and my old friend Steve Ognibene apparently originally wrote the type definitions which can be found here. Then the definitions were updated by Miika Hänninen. And then there was pain.

UMD / CommonJS **and** Global exports oh my!

My usage code was as simple as this:


import * as BigJs from 'big.js';
const lookABigJs = new BigJs(1);

If you execute it in a browser it works. It makes me a Big. However the TypeScript compiler is **not** happy. No siree. Nope. It's bellowing at me:


[ts] Cannot use 'new' with an expression whose type lacks a call or construct signature.

So I think: "Huh! I guess Miika just missed something off when he updated the definition files. No bother. I'll fix it." I take a look at how big.js exposes itself to the outside world. At the time, thusly:


    //AMD.
    if (typeof define === 'function' && define.amd) {
        define(function () {
            return Big;
        });
        
    // Node and other CommonJS-like environments that support module.exports.
    } else if (typeof module !== 'undefined' && module.exports) {
        module.exports = Big;
        module.exports.Big = Big;
    //Browser.
    } else {
        global.Big = Big;
    }

Now, we were using webpack as our script bundler / loader. webpack is supersmart; it can take all kinds of module formats. So although it's more famous for supporting CommonJS, it can roll with AMD. That's exactly what's happening here. When webpack encounters the above code, it goes with the AMD export. So at runtime, import * as BigJs from 'big.js'; ends up resolving to the return Big; above.

Now this turns out to be super-relevant. I took a look at the relevant portion of the definition file and found this:


export const Big: BigConstructor;

Which tells me that Big is being exported as a subproperty of the module. That makes sense; that lines up with the module.exports.Big = Big; statement in the big.js source code. There's a "gotcha" coming; can you guess what it is?

The problem is that our type definition is not exposing Big as a default export. So even though it's there, TypeScript won't let us use it. What's killing us further is that webpack is loading the AMD export, which doesn't have Big as a subproperty of the module; it only has it as the default.

Kitson Kelly expressed the problem well when he said:

there is a different shape depending on which loader is being used and I am not sure that makes a huge amount of sense. The AMD shape is different than the CommonJS shape. While that is technically possible, that feels like that is an issue.

One Definition to Rule Them All

He's right; it is an issue. From a TypeScript perspective there is no way to write a definition file that allows for different module "shapes" depending upon the module type. If you really wanted to do that you'd be reduced to writing multiple definition files. That's a blind alley anyway; what you want is a module that exposes itself with the same "shape" regardless of the module type. What you want is this:

AMD === CommonJS === Global
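
For the curious, a definition file for a library with that unified shape can use TypeScript's export assignment together with export as namespace. Here's a rough sketch - illustrative only, and not the actual @types/big.js file:


// bigjs-like.d.ts - an illustrative sketch, not the real @types/big.js definition
interface BigInstance {
    plus(other: number | string | BigInstance): BigInstance;
    toString(): string;
}

interface BigConstructor {
    new (value: number | string | BigInstance): BigInstance;
    (value: number | string | BigInstance): BigInstance;
}

declare const Big: BigConstructor;

// a single export assignment serves CommonJS and AMD consumers...
export = Big;
// ...and the same name is exposed as a global for script tag consumers
export as namespace Big;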

And that's what we now have! Thanks to Michael McLaughlin, author of big.js, version 4.0 unified the export shape of the package. Miika Hänninen submitted another PR which fixed up the type definitions. And once again the world is a beautiful place!

Thursday, 19 October 2017

Working with Extrahop on webpack and ts-loader

I'm quite proud of this: https://www.extrahop.com/company/blog/2017/extrahop-webpack-accelerating-build-times/

If you didn't know, I spend a good amount of my spare time hacking on open source software. You may not know what that is. I would describe OSS as software made with ❤ by people, for other people to use.

You are currently reading this on a platform that was built using OSS. It's all around you, every day. It's on your phone, on your computer, on your TV. It's everywhere.

It's my hobby, it's part of my work. This specifically was one of those tremendously rare occasions when I got paid directly to work on my hobby, with people much brighter than me. It was brilliant. I loved it; it was a privilege.

Here's to Open Source!

Tuesday, 12 September 2017

fork-ts-checker-webpack-plugin code clickability

My name is John Reilly and I'm a VS Code addict. There I said it. I'm also a big fan of TypeScript and webpack. I've recently switched to using the awesome fork-ts-checker-webpack-plugin to speed up my builds.

One thing I love is using VS Code both as my editor and my terminal. Using the fork-ts-checker-webpack-plugin I noticed a problem when TypeScript errors showed up in the terminal:

Take a look at the red file location in the console above. What's probably not obvious from the above screenshot is that it is not clickable. I'm used to being able to click on a link in the console and bounce straight to the error location. It's a really productive workflow; see a problem, click on it, be taken to the cause, fix it.

I want to click on "C:/source/ts-loader/examples/fork-ts-checker/src/fileWithError.ts(2,7)" and have VS Code open up fileWithError.ts, ideally at line 2 and column 7. But here it's not working. Why?

Well, I initially got this slightly wrong; I thought it was about the formatting of the file path. It is. I thought that having the line number and column number in parentheses after the path (eg "(2,7)") was screwing over VS Code. It isn't. Something else is. Look closely at the screenshot; what do you see? Do you notice how the colour of the line number / column number is different to the path? In the words of Delbert Wilkins: that's crucial.

Yup, the colour change between the path and the line number / column number is the problem. I've submitted a PR to fix this that I hope will get merged. In the meantime you can avoid this issue by dropping this code into your webpack.config.js:


var chalk = require("chalk");
var os = require("os");
var ForkTsCheckerWebpackPlugin = require("fork-ts-checker-webpack-plugin");

function clickableFormatter(message, useColors) {
    var colors = new chalk.constructor({ enabled: useColors });
    var messageColor = message.isWarningSeverity() ? colors.bold.yellow : colors.bold.red;
    var fileAndNumberColor = colors.bold.cyan;
    var codeColor = colors.grey;
    return [
        messageColor(message.getSeverity().toUpperCase() + " in ") +
        fileAndNumberColor(message.getFile() + "(" + message.getLine() + "," + message.getCharacter() + ")") +
        messageColor(':'),

        codeColor(message.getFormattedCode() + ': ') + message.getContent()
    ].join(os.EOL);
};

module.exports = {
    // Other config...
    module: {
        rules: [
            {
                test: /\.tsx?$/,
                loader: 'ts-loader',
                options: { transpileOnly: true }
            }
        ]
    },
    resolve: {
        extensions: [ '.ts', '.tsx', '.js' ]
    },
    plugins: [
        new ForkTsCheckerWebpackPlugin({ formatter: clickableFormatter }) // Here we get our clickability back
    ]
};

With that in place, what do we have? This:

VS Code clickability; it's a beautiful thing.

Thursday, 7 September 2017

TypeScript + Webpack: Super Pursuit Mode

This post also featured as a webpack Medium publication.

If you're like me then you'll like TypeScript and you'll like module bundling with webpack. You may also like speedy builds. That's completely understandable. The fact of the matter is, you sacrifice a bit of build speed to have webpack in the mix. Wouldn't it be great if we could even up the difference?

I'm the primary maintainer of ts-loader, a TypeScript loader for webpack. Just recently a couple of PRs were submitted that said, in other words: ts-loader is like this:

But it could be like this:

Apologies for the image quality above; there appear to be no high quality pictures out there of KITT in Super Pursuit Mode for me to defame with Garan Jenkin's atrocious puns.

fork-ts-checker-webpack-plugin

"Faster type checking with forked process" read the enticing name of the issue. It turned out to be Piotr OleÅ› (@OlesDev) telling the world about his beautiful creation. He'd put together a mighty fine plugin that can be used alongside ts-loader called the fork-ts-checker-webpack-plugin. The name is a bit of a mouthful but the purpose is mouth-watering. To quote the README, it is a:

Webpack plugin that runs typescript type checker on a separate process.

What does this mean and how does this fit with ts-loader? Well, ts-loader does 2 jobs:

  1. It transpiles your TypeScript into JavaScript and hands it off to webpack
  2. It collects any TypeScript compilation errors and reports them to webpack

What this plugin does is say, "forget about #2 - we've got this." It removes the responsibility for type checking from ts-loader, so the only work ts-loader does is transpilation. The all-important type checking still happens - there would be little reason to recommend this approach otherwise - but fork-ts-checker-webpack-plugin does the heavy lifting in a separate process. This provides a nice performance boost to your workflow. ts-loader is doing less and that's a good thing.

The approach used here is similar to that employed by awesome-typescript-loader. ATL is another TypeScript loader for webpack by the excellent Stanislav Panferov. ATL also has a technique for performing typechecking in a forked process. fork-ts-checker-webpack-plugin was an effort by Piotr to implement something similar but with improved incremental build performance.

How do we use it? Add fork-ts-checker-webpack-plugin as a devDependency of your project and then amend the webpack.config.js to set ts-loader into transpileOnly mode and drop the plugin into the mix:


var ForkTsCheckerWebpackPlugin = require('fork-ts-checker-webpack-plugin');

var webpackConfig = {
  // other config...
  context: __dirname, // to automatically find tsconfig.json
  module: {
    rules: [
      {
        test: /\.tsx?$/,
        loader: 'ts-loader',
        options: {
          // disable type checker - we will use it in fork plugin
          transpileOnly: true 
        }
      }
    ]
  },
  plugins: [
    new ForkTsCheckerWebpackPlugin()
  ]
};

If you'd like to see an example of how to use the plugin then take a look at a simple example and a more involved one.

HappyPack

Not so long ago I didn't know what HappyPack was. "Happiness in the form of faster webpack build times." That's what it is.

HappyPack makes webpack builds faster by allowing you to transform multiple files in parallel.

It does this by spinning up multiple threads, each with their own loaders inside. We wanted to do this with ts-loader; to have multiple instances of ts-loader running. Work can then be divided up across these separate loaders. Isn't multi-threading great?

ts-loader did not initially play nicely with HappyPack; essentially this is because ts-loader touches parts of webpack's API that HappyPack replaces. The entirely wonderful Artem Kozlov submitted a PR which added HappyPack support to ts-loader. Support essentially amounts to switching ts-loader to run in transpileOnly mode and ensuring that there is no attempt to talk to parts of the webpack API that HappyPack removes.

It would be hard to recommend using HappyPack as is because, as with transpileOnly mode, you lose all typechecking. Where it becomes worthwhile is when it is combined with the fork-ts-checker-webpack-plugin so you keep the typechecking.

Enough with the chitter chatter; how can we achieve this? Add HappyPack as a devDependency of your project and then amend the webpack.config.js as follows:


var HappyPack = require('happypack');
var ForkTsCheckerWebpackPlugin = require('fork-ts-checker-webpack-plugin');

module.exports = {
    // other config...
    context: __dirname, // to automatically find tsconfig.json
    module: {
        rules: [
            {
                test: /\.tsx?$/,
                exclude: /node_modules/,
                loader: 'happypack/loader?id=ts'
            }
        ]
    },
    plugins: [
        new HappyPack({
            id: 'ts',
            threads: 2,
            loaders: [
                {
                    path: 'ts-loader',
                    query: { happyPackMode: true }
                }
            ]
        }),
        new ForkTsCheckerWebpackPlugin({ checkSyntacticErrors: true })
    ]
};

Note that the ts-loader options are now configured via the HappyPack query and that we're running ts-loader with the happyPackMode option set.

There's one other thing to note which is important; we're now passing the checkSyntacticErrors option to the fork plugin. This ensures that the plugin checks for both syntactic errors (eg const array = [{} {}];) and semantic errors (eg const x: number = '1';). By default the plugin only checks for semantic errors. This is because when ts-loader is used with transpileOnly set, ts-loader will still report syntactic errors. But when used in happyPackMode it does not.

If you'd like to see an example of how to use HappyPack then once again we have a simple example and a more involved one.

thread-loader + cache-loader

You might have some reservations about using HappyPack. First of all, the quirky configuration required makes your webpack config rather less comprehensible. Also, HappyPack is not officially blessed by webpack. It is a side project developed externally from webpack and there are no guarantees that new versions of webpack won't break it. Neither of these are reasons not to use HappyPack but they are things to bear in mind.

What if there were a way to parallelise our builds which dealt with these issues? Well, there is! By using thread-loader and cache-loader in combination you can both feel happy that you're using an official webpack workflow and you can have a config that's less confusing.

What would that config look like? This:


var ForkTsCheckerWebpackPlugin = require('fork-ts-checker-webpack-plugin');

module.exports = {
    // other config...
    context: __dirname, // to automatically find tsconfig.json
    module: {
        rules: [{
            test: /\.tsx?$/,
            use: [
                { loader: 'cache-loader' },
                {
                    loader: 'thread-loader',
                    options: {
                        // there should be 1 cpu for the fork-ts-checker-webpack-plugin
                        workers: require('os').cpus().length - 1,
                    },
                },
                {
                    loader: 'ts-loader',
                    options: {
                        happyPackMode: true // IMPORTANT! use happyPackMode mode to speed-up compilation and reduce errors reported to webpack
                    }
                }
            ]
        }]
    },
    plugins: [
        new ForkTsCheckerWebpackPlugin({ checkSyntacticErrors: true })
    ]
};

As you can see, the configuration is much cleaner than with HappyPack. Interestingly, ts-loader still needs to run in "happyPackMode" and that's because thread-loader is essentially behaving in the same fashion as HappyPack, and so ts-loader needs to behave in the same way. Probably ts-loader should have a more generic flag name than "happyPackMode". (Famously, naming things is hard; so if you've a good idea, tell me!)

These loaders are new and so tread carefully. My own experiences have been pretty positive but your mileage may vary. Do note that, as with HappyPack, the thread-loader is highly configurable.

If you'd like to see an example of how to use thread-loader and cache-loader then once again we have a simple example and a more involved one.

All This Could Be Yours...

In this post we're improving build speeds with TypeScript and webpack in 3 ways:

fork-ts-checker-webpack-plugin
With this plugin in play ts-loader only performs transpilation. ts-loader is doing less so the build is faster.
HappyPack
With HappyPack in the mix, the build is parallelised. That parallelisation means the build is faster.
thread-loader / cache-loader
With thread-loader and cache-loader, again the build is parallelised and the build is faster.

Wednesday, 30 August 2017

Oh the Glamour of Open Source

Here's how my life panned out in the early hours of Wednesday 30th August 2017:

2 am
awoken by Lisette having a nightmare
3 am
gave up hope of getting back to sleep upstairs and headed for the sofa
4 am
still not asleep and discovered a serious gap in an open source project I help out with
4:30 am
come up with idea for a fix
4:45 am
accidentally delete a repo that I and many others care about from GitHub
4:50 am
recover said repo from backups (sweet mercy how could I be so stupid?)
4:55 am
actually succeed in cloning the repo I want to hack on
5:30 am
implement fix and send PR
5:35 am
go for a walk round the river
6:30 am
realise I didn't submit a test for the changed functionality
6:35 am
write test only to discover I can't run the test pack on Windows
6:40 am
add test to PR anyway so I can see test results when Travis runs on each commit.
7 am
despair at the duration of my feedback loop, totally fail to get my tests to pass
7:10 am
stub my toe really badly on a train set Benjamin has been busily assembling beneath my feet
7:11 am
give in and literally beg the project owner in Paris to fix the tests for me. He takes pity on me and agrees. Possibly because I gave him emoji tulips 🌷
7:12 am
feel like a slight failure and profoundly tired.

Oh the glamour of open source.

Sunday, 27 August 2017

Karma: From PhantomJS to Headless Chrome

Like pretty much everyone else I've been using PhantomJS to run my JavaScript (or compiled-to-JS) unit tests. It's been great. So when I heard the news that PhantomJS was dead I was genuinely sad. However, the King is dead.... Long live the King! For there is a new hope; it's called Chrome Headless. It's not a separate version of Chrome; rather, the ability to run Chrome without a UI is now baked into Google's favourite browser as of v59. (For those history buffs I might as well be clear: the main reason PhantomJS died is because Chrome Headless was in the works.)

Making the Switch

As long as you're running Chrome v59 or greater then you can switch. I've just made ts-loader's execution test pack run with Chrome Headless instead of PhantomJS and I've rarely been happier. Honest. Some context: the execution test pack runs Jasmine unit tests via the Karma test runner. The move was surprisingly easy and you can see just how minimal it was in the PR here. If you want to migrate a test pack that runs via Karma then this will take you through what you need to do.

package.json

You no longer need phantomjs-prebuilt as a dev dependency of your project. That's the PhantomJS browser disappearing in the rear view mirror. Next we need to replace karma-phantomjs-launcher with karma-chrome-launcher. These packages are responsible for firing up the browser that the tests are run in and we no longer want to invoke PhantomJS; we're Chrome all the way baby.

karma.conf.js

You need to tell Karma to use Chrome Headless instead of PhantomJS. You do that by replacing


   browsers: [ 'PhantomJS' ],

with


   browsers: [ 'ChromeHeadless' ],

That's it; job done!
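
For context, the relevant corner of a karma.conf.js ends up looking something like this sketch; only the browsers entry is the actual change, the rest is illustrative or simply unchanged:


// karma.conf.js - a minimal sketch; everything except the browsers entry stays as it was
module.exports = function (config) {
  config.set({
    frameworks: ['jasmine'],      // ts-loader's execution test pack uses Jasmine
    // files, preprocessors etc. are untouched by this migration
    browsers: ['ChromeHeadless'], // previously: ['PhantomJS']
    singleRun: true
  });
};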

Continuous Integration

There's always one more thing isn't there? Yup, ts-loader has CI builds that run on Windows with AppVeyor and Linux with Travis. The AppVeyor build went green on the first run; that's because Chrome is installed by default in the AppVeyor build environment. (yay!)

Travis went red. (boooo!) Travis doesn't have Chrome installed by default. But it's no biggie; you just need to tweak your .travis.yml like so:


dist: trusty
addons:
  chrome: stable

This includes Chrome in the Travis build environment. Green. Boom!

Sunday, 2 July 2017

Dynamic import: I've been awaiting you...

One of the most exciting features to ship with TypeScript 2.4 was support for the dynamic import expression. To quote the release blog post:

Dynamic import expressions are a new feature in ECMAScript that allows you to asynchronously request a module at any arbitrary point in your program. These modules come back as Promises of the module itself, and can be await-ed in an async function, or can be given a callback with .then.

...

Many bundlers have support for automatically splitting output bundles (a.k.a. “code splitting”) based on these import() expressions, so consider using this new feature with the esnext module target. Note that this feature won’t work with the es2015 module target, since the feature is anticipated for ES2018 or later.

As the post makes clear, this adds support for a very bleeding edge ECMAScript feature. This is not fully standardised yet; it's currently at stage 3 on the TC39 proposals list. That means it's at the Candidate stage and is unlikely to change further. If you'd like to read more about it then take a look at the official proposal here.

Whilst this is super-new, we are still able to use this feature. We just have to jump through a few hoops first.

TypeScript Setup

First of all, you need to install TypeScript 2.4. With that in place you need to make some adjustments to your tsconfig.json so that the relevant compiler switches are flipped. What do you need? You need to be targeting ECMAScript 2015 as a minimum. That's important specifically because ES2015 includes Promises, which is what dynamic imports produce. The second thing you need is to target the module type of esnext. You're likely targeting es2015 now; esnext is that plus dynamic imports.

Here's a tsconfig.json I made earlier which has the relevant settings set:


{
    "compilerOptions": {
        "allowSyntheticDefaultImports": true,
        "lib": [
            "dom",
            "es2015"
        ],
        "target": "es2015",
        "module": "esnext",
        "moduleResolution": "node",
        "noImplicitAny": true,
        "noUnusedLocals": true,
        "noUnusedParameters": true,
        "removeComments": false,
        "preserveConstEnums": true,
        "sourceMap": true,
        "skipLibCheck": true
    }
}

Babel Setup

At the time of writing, browser support for dynamic import is non-existent. This will likely be the case for some time but it needn't hold us back. Babel can step in here and compile our super-new JS into JS that will run in our browsers today.

You'll need to decide for yourself how much you want Babel to do for you. In my case I'm targeting old school browsers which don't yet support ES2015. You may not need to. However, the one thing that you'll certainly need is the Syntax Dynamic Import plugin. It's this that allows Babel to process dynamic import statements.

These are the options I'm passing to Babel:


var babelOptions = {
  "plugins": ["syntax-dynamic-import"],
  "presets": [
    [
      "es2015",
      {
        "modules": false
      }
    ]
  ]
};

You're also going to need something that actually executes the imports. In my case I'm using webpack...

webpack

webpack 2 supports import(). So if you have webpack set up with ts-loader (or awesome-typescript-loader etc) chaining into babel-loader, you should find you have a setup that supports dynamic import. That means a webpack.config.js that looks something like this:


var path = require('path');
var webpack = require('webpack');

var babelOptions = {
  "plugins": ["syntax-dynamic-import"],
  "presets": [
    [
      "es2015",
      {
        "modules": false
      }
    ]
  ]
};

module.exports = {
  entry: './app.ts',
  output: {
      filename: 'bundle.js'
  },
  module: {
    rules: [{
      test: /\.ts(x?)$/,
      exclude: /node_modules/,
      use: [
        {
          loader: 'babel-loader',
          options: babelOptions
        },
        {
          loader: 'ts-loader'
        }
      ]
    }, {
      test: /\.js$/,
      exclude: /node_modules/,
      use: [
        {
          loader: 'babel-loader',
          options: babelOptions
        }
      ]
    }]
  },
  resolve: {
    extensions: ['.ts', '.tsx', '.js']
  },
};

ts-loader example

I'm one of the maintainers of ts-loader which is a TypeScript loader for webpack. When support for dynamic imports landed I wanted to add a test to cover usage of the new syntax with ts-loader.

We have 2 test packs for ts-loader, one of which is our "execution" test pack. It is so named because it works by spinning up webpack with ts-loader and then using karma to execute a set of tests. Each "test" in our execution test pack is actually a mini-project with its own test suite (generally jasmine but that's entirely configurable). Each comes complete with its own webpack.config.js, karma.conf.js and either a typings.json or package.json for bringing in dependencies. So it's a full test of whether code slung with ts-loader and webpack actually executes when the output is plugged into a browser.

This is the test pack for dynamic imports:


import a from "../src/a";
import b from "../src/b";

describe("app", () => {
  it("a to be 'a' and b to be 'b' (classic)", () => {
    expect(a).toBe("a");
    expect(b).toBe("b");
  });

  it("import results in a module with a default export", done => {
    import("../src/c").then(c => {
      // .default is the default export
      expect(c.default).toBe("c");

      done();
    });
  });

  it("import results in a module with an export", done => {
    import("../src/d").then(d => {
      // .default is the default export
      expect(d.d).toBe("d");

      done();
    });
  });

  it("await import results in a module with a default export", async done => {
    const c = await import("../src/c");

    // .default is the default export
    expect(c.default).toBe("c");

    done();
  });

  it("await import results in a module with an export", async done => {
    const d = await import("../src/d");

    expect(d.d).toBe("d");

    done();
  });
});

As you can see, it's possible to use the dynamic import as a Promise directly. Alternatively, it's possible to consume the imported module using TypeScript's support for async / await. For my money the latter option makes for much clearer code.

If you're looking for a complete example of how to use the new syntax then you could do worse than taking the existing test pack and tweaking it to your own ends. The only change you'd need to make is to strip out the resolveLoader statements in webpack.config.js and karma.conf.js. (They exist to lock the test in question to the freshly built ts-loader stored locally. You'll not need this.)
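
For reference, the sort of block you're deleting looks something like this (the path here is specific to the ts-loader repo's layout):


// pins 'ts-loader' to the locally built copy within the ts-loader repo;
// in your own project you simply delete this section
module.exports.resolveLoader = {
    alias: { 'ts-loader': require('path').join(__dirname, '../../index.js') }
};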

You can find the test in question here. Happy code splitting!

Sunday, 11 June 2017

Windows Defender Step Away From npm

Update 18/06/2017

Whilst things did improve by fiddling with Windows Defender it wasn't a 100% fix which makes me wary. Interestingly, VS Code was always open when I did experience the issue and I haven't experienced it when it's been closed. So it may be the cause. I've opened an issue for this against the VS Code repo - it sounds like other people may be affected as I was. Perhaps this is VS Code and not Windows Defender. Watch that space...

Update 12/07/2017

The issue was VS Code. The bug has now been fixed and shipped last night with VS Code 1.14.0. Yay!


I've recently experienced many of my npm installs failing for no consistent reason. The error message would generally be something along the lines of:


npm ERR! Error: EPERM: operation not permitted, rename 'C:\dev\training\drrug\node_modules\.staging\@exponent\ngrok-fc327f2a' -> 'C:\dev\training\drrug\node_modules\@exponent\ngrok'

I spent a good deal of time changing the versions of node and npm I was running; all seemingly to no avail. Regular flakiness which I ascribed to node / npm. I was starting to give up when I read of other people experiencing similar issues. Encouragingly Fernando Meira suggested a solution:

I got the same problem just doing an npm install. Run with antivirus disabled (if you use Windows Defender, turn off Real-Time protection and Cloud-based protection). That worked for me!

I didn't really expect this to work - Windows Defender has been running in the background of my Windows 10 laptop since I've had it. There's been no problems with npm installs up until a week or so ago. But given the experience I and others have had I thought I should put it out there: it looks like Windows Defender has it in for npm. Go figure.

Alas Windows Defender doesn't stay dead for long; it's like a zombie that rises from the grave no matter how many times you kill it. So you might want to try configuring it to ignore node.exe:

Or switching to Linux...

Saturday, 20 May 2017

TypeScript: Spare the Rod, Spoil the Code

I've recently started a new role. Perhaps unsurprisingly, part of the technology stack is TypeScript. A couple of days into the new codebase I found a bug. Well, I say I found a bug, TypeScript and VS Code found the bug - I just let everyone else know.

The flexibility that TypeScript offers in terms of compiler settings is second to none. You can turn up the dial of strictness to your heart's content. Or down. I'm an "up" man myself.

The project that I am working on has the dial set fairly low; it's pretty much using the default compiler values which are (sensibly) not too strict. I have to say this makes sense for helping people get on board with using TypeScript. Start from a point of low strictness and turn it up when you're ready. As you might have guessed, I cranked the dial up on day one on my own machine. I should say that as I did this, I didn't foist this on the project at large - I kept it just to my build... I'm not *that* guy!

I made the below changes to the tsconfig.json file. Details of what each of these settings does can be found in the documentation here.


    "noImplicitAny": true,
    "noImplicitThis": true,
    "noUnusedLocals": true,
    "noImplicitReturns": true,
    "noUnusedParameters": true,

I said I found a bug. The nature of the bug was an unused variable; a variable was created in a function but then not used. Here's a super simple example:


function sayHi(name: string) {
    const greeting = `Hi ${ name }`;
    return name;
}

It's an easy mistake to make. I've made this mistake before myself. But with the noUnusedLocals compiler setting in place it's now an easy mistake to catch; VS Code lets you know loud and clear:

The other compiler settings will similarly highlight simple mistakes it's possible to make and I'd recommend using them. I should say I've written this from the perspective of a VS Code user, but this really applies generally to TypeScript usage. So whether you're an alm.tools guy, a WebStorm gal or something else entirely then this too can be yours!

I'd also say that the strictNullChecks compiler setting is worth looking into. However, switching an already established project to using that can involve fairly extensive code changes and will also require a certain amount of education of, and buy in from, your team. So whilst I'd recommend it too, I'd save that one until last.
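
To give a flavour of what strictNullChecks buys you, here's a tiny, purely illustrative example. With the flag on, removing the null guard has the compiler report "Object is possibly 'null'.":


function firstCharacter(value: string | null) {
    // without this guard, strictNullChecks flags the charAt call below
    if (value === null) {
        return '';
    }
    return value.charAt(0);
}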

Tuesday, 25 April 2017

Setting Build Version Using AppVeyor and ASP.Net Core

AppVeyor has support for setting the version of a binary during a build. However, this deals with the classic ASP.Net world of AssemblyInfo. I didn't find any reference to support for doing the same with .NET Core. Remember, .NET Core relies upon a <Version> or a <VersionPrefix> setting in the .csproj file. Personally, <Version> is my jam.

However, coming up with your own bit of powershell that stamps the version during the build is a doddle; here we go:


Param($projectFile, $buildNum)

$content = [IO.File]::ReadAllText($projectFile)

$regex = new-object System.Text.RegularExpressions.Regex ('(<Version>)([\d]+\.[\d]+\.[\d]+)(\.[\d]+)(<\/Version>)', 
         [System.Text.RegularExpressions.RegexOptions]::MultiLine)

$version = $null
$match = $regex.Match($content)
if($match.Success) {
    # from "1.0.0.0" this will extract "1.0.0"
    $version = $match.groups[2].value
}

# suffix build number onto $version. eg "1.0.0.15"
$version = "$version.$buildNum"

# update "1.0.0.0" to "$version"
$content = $regex.Replace($content, '${1}' + $version + '${4}')

# update csproj file
[IO.File]::WriteAllText($projectFile, $content)

# update AppVeyor build
Update-AppveyorBuild -Version $version

You can invoke this script as part of the build process in AppVeyor by adding something like this to your appveyor.yml.


before_build:
- ps: .\ModifyVersion.ps1 $env:APPVEYOR_BUILD_FOLDER\src\Proverb.Web\Proverb.Web.csproj $env:APPVEYOR_BUILD_NUMBER

It will keep the first 3 parts of the version in your .csproj (eg "1.0.0") and suffix on the build number supplied by AppVeyor.

Thursday, 30 March 2017

I'm looking for work!

My name is John Reilly. I'm a full stack developer based in London, UK. I'm just coming to the end of a contract (due to finish in April 2017) and I'm starting to look for my next role.

I have more than 15 years experience developing software commercially. I've worked in a number of industries including telecoms, advertising, technology (I worked at Microsoft for a time) and, of course, finance. The bulk of my experience is in the finance sector. I've provided consultancy services, building and maintaining applications for both large and small companies; from enterprise to startup.

My most recent work has been full stack web work; using React on the front end and SignalR (ASP.Net) on the back end. I'm pragmatic about the tools that I use to deliver software solutions and not tied to any particular technology. That said, I've gravitated towards the handiwork of Anders Hejlsberg; starting out with Delphi and being both an early C# and TypeScript adopter. I've built everything from high volume trade feeds with no UI beyond a log file, WinForms apps for call centres, to fully fledged rich web applications with a heavy emphasis on UX.

I enjoy the challenges of understanding problems and coming up with useful solutions to them. I'm thrilled when something I've built makes someone's life easier. I love to learn and to share my knowledge; both in person and also through writing this blog. (This is the first time I've used a post to seek work.)

In my spare time I'm involved with various open source projects including ts-loader and DefinitelyTyped (member of the core team). Get in contact with me if you're interested in learning more about me. Mail me at johnny_reilly@hotmail.com and I can provide you with a CV. You can also find me on GitHub.

Update 25/04/2017: Position Filled

I'm happy to say that I've lined up work for the next 6 months or so. Once again I'll be working in the financial services industry with one interesting twist. In a blog post ages ago I bet that native apps would start to be replaced with SPAs. This has started to happen. I've started to see companies taking a "web-first-and-only" approach to building apps. In that vein, that's exactly what I'm off to build.

As a result of publishing this blog post I've had some interesting conversations with companies and got to think hard about the direction the industry is taking. I remain excited by JavaScript / TypeScript and React. I'm hopeful of the possibilities offered by the container world of Docker etc. I'm enjoying .NET Core and have very high hopes for it. I remain curious about Web Assembly.

Before I sign off, I know at some point I'll be looking for work once again. If there's a system you'd like built, if there's some mentoring and training you'd like done or if you'd just like to have a conversation I'm always available to talk. Drop me a line at johnny_reilly@hotmail.com.

Tuesday, 28 March 2017

Debugging ASP.Net Core in VS or Code

I've been using Visual Studio for a long time. Very good it is too. However, it is heavyweight; it does far more than I need. What I really want when I'm working is a fast snappy editor, with intellisense and debugging. What I've basically described is VS Code. It rocks and has long become my go-to editor for TypeScript.

Since I'm a big C# fan as well I was delighted that editing C# was also possible in Code. What I want now is to be able to debug ASP.Net Core in Visual Studio OR VS Code. Can it be done? Let's see....

I fire up Visual Studio and File -> New Project (yes it's a verb now). Select .NET Core and then ASP.Net Core Web Application. OK. We'll go for a Web Application. Let's not bother with authentication. OK. Wait a couple of seconds and Visual Studio serves up a new project. Hit F5 and we're debugging in Visual Studio.

So far, so straightforward. What will VS Code make of this?

I cd my way to the root of my new ASP.Net Core Web Application and type the magical phrase "code .". Up it fires. I feel lucky, let's hit "F5". Huh, a dropdown shows up saying "Select Environment" and offering me the options of Chrome and Node. Neither is what I want. It's about this time I remember this is a clean install of VS Code and doesn't yet have the C# extension installed. In fact, if I open a C# file up, it tells me as much and recommends that I install it. Well that's nice. I take it up on the kind offer; install and reload.

When it comes back up I see the following entries in the "output" tab:


Updating C# dependencies...
Platform: win32, x86_64 (win7-x64)

Downloading package 'OmniSharp (.NET 4.6 / x64)' (20447 KB) .................... Done!
Downloading package '.NET Core Debugger (Windows / x64)' (39685 KB) .................... Done!

Installing package 'OmniSharp (.NET 4.6 / x64)'
Installing package '.NET Core Debugger (Windows / x64)'

Finished

Note that mention of "debugger" there? Sounds super-promising. There are also some prompts: "There are unresolved dependencies from 'WebApplication1/WebApplication1.csproj'. Please execute the restore command to continue"

So it wants me to dotnet restore. It's even offering to do that for me! Have at you; I let it.


Welcome to .NET Core!
---------------------
Learn more about .NET Core @ https://aka.ms/dotnet-docs. Use dotnet --help to see available commands or go to https://aka.ms/dotnet-cli-docs.

Telemetry
--------------
The .NET Core tools collect usage data in order to improve your experience. The data is anonymous and does not include command-line arguments. The data is collected by Microsoft and shared with the community.
You can opt out of telemetry by setting a DOTNET_CLI_TELEMETRY_OPTOUT environment variable to 1 using your favorite shell.
You can read more about .NET Core tools telemetry @ https://aka.ms/dotnet-cli-telemetry.

Configuring...
-------------------
A command is running to initially populate your local package cache, to improve restore speed and enable offline access. This command will take up to a minute to complete and will only happen once.
Decompressing Decompressing 100% 4026 ms
Expanding 100% 34814 ms
  Restoring packages for c:\Source\Debugging\WebApplication1\WebApplication1\WebApplication1.csproj...
  Restoring packages for c:\Source\Debugging\WebApplication1\WebApplication1\WebApplication1.csproj...
  Restore completed in 734.05 ms for c:\Source\Debugging\WebApplication1\WebApplication1\WebApplication1.csproj.
  Generating MSBuild file c:\Source\Debugging\WebApplication1\WebApplication1\obj\WebApplication1.csproj.nuget.g.props.
  Writing lock file to disk. Path: c:\Source\Debugging\WebApplication1\WebApplication1\obj\project.assets.json
  Restore completed in 1.26 sec for c:\Source\Debugging\WebApplication1\WebApplication1\WebApplication1.csproj.
  
  NuGet Config files used:
      C:\Users\johnr\AppData\Roaming\NuGet\NuGet.Config
      C:\Program Files (x86)\NuGet\Config\Microsoft.VisualStudio.Offline.config
  
  Feeds used:
      https://api.nuget.org/v3/index.json
      C:\Program Files (x86)\Microsoft SDKs\NuGetPackages\
Done: 0.

The other prompt says "Required assets to build and debug are missing from 'WebApplication1'. Add them?". This also sounds very promising and I give it the nod. This creates a .vscode directory and 2 enclosed files; launch.json and tasks.json.

So let's try that F5 thing again... http://localhost:5000/ is now serving the same app. That looks pretty good. So let's add a breakpoint to the HomeController and see if we can hit it:

Well I can certainly add a breakpoint but all those red squigglies are unnerving me. Let's clean the slate. If you want to simply do that in VS Code hold down CTRL+SHIFT+P and then type "reload". Pick "Reload window". A couple of seconds later we're back in and Code is looking much happier. Can we hit our breakpoint?

Yes we can! So you're free to develop in either Code or VS; the choice is yours. I think that's pretty awesome - and well done to all the people behind Code who've made this a pretty seamless experience!

Thursday, 23 February 2017

Under the Duck: An Afternoon in Open Source

Have you ever wondered what happens behind the scenes of open source projects? One that I'm involved with is ts-loader; a TypeScript loader for webpack. Yesterday was an interesting day in the life of ts-loader and webpack; things unexpectedly broke. Oh and don't worry, they're fixed now.

How things panned out reflects well on the webpack community. I thought it might be instructive to take a look at the legs furiously paddling underneath the duck of open source. What follows is a minute by minute account of my life on the afternoon of Wednesday 22nd February 2017:

3:55pm
I'm sat at my desk in the City of London. I have to leave at 4pm to go to the dentist. I'm working away on a project which is built and bundled using ts-loader and webpack. However, having just npm installed and tried to spin up webpack in watch mode, I discover that everything is broken. Watch mode is not working - there's an error being thrown in ts-loader. It's to do with a webpack property called mtimes. ts-loader depends upon it and it looks like it is no longer always passed through. Go figure.
4:01pm

I've got to go. I'm 15 minutes from Bank station. So, I grab my bag and scarper out the door. On my phone I notice an issue has been raised - other people are being affected by the problem too. As I trot down the various alleys that lead to the station I wonder whether I can work around this issue. Using GitHub to fork, edit code and submit a PR on a mobile phone is possible. Just. But it's certainly not easy...

My PR is in, the various test packs are starting to execute somewhere out there in Travis and Appveyor-land. Then I notice Ed Bishop has submitted a near identical PR. Yay Ed! I'm always keen to encourage people to contribute and so I intend to merge that PR rather than my own.

16:12

Rubbish. The Waterloo and City Line is out of action. I need to get across London to reach Waterloo or I'll miss my appointment. It's time to start running....

16:15

It's rather nagging at me that behaviour has changed without warning. mtimes has been reliably in place the entire time I've been involved with ts-loader / webpack. Why now? I don't see any obvious mentions on the webpack GitHub repo. So I head over to the webpack Slack channel and ask: (conversation slightly abridged)

johnny_reilly

Hey all, has something happened to mtimes? Behaviour seems to have changed - now undefined occasionally during watch mode. A PR has been raised against ts-loader to work around this https://github.com/TypeStrong/ts-loader/pull/480#issuecomment-281714600

However I'm wondering if this should actually be merged given behaviour has changed unexpectedly

sokra

ah...

i removed it. I thought it was unused.

johnny_reilly

It's definitely not!

sokra

it's not in the public API^^

Any reason why you are not using getTimes()?

...

johnny_reilly
Okay, I'm on a train and won't be near a computer for a while. ts-loader is presently broken because it depends on mtimes. Would it be possible for you to add this back at least for now. I'm aware many people depend on ts-loader and are now broken.
sokra

sure, I readd it but deprecate it.

...

sean.larkin
@sokra is this the change you just made for that watchpack bug fix? Or unlrelated, just wanted to track if I didn't already have the change/issue
sokra

https://github.com/webpack/watchpack/pull/48

johnny_reilly

This is what the present code does:


const watcher = watching.compiler.watchFileSystem.watcher || 
                watching.compiler.watchFileSystem.wfs.watcher

And then .mtimes

Should I be able to do .getTimes() instead?

sokra

actually you can't rely on watchFileSystem being NodeJsWatchFileSystem. But this is another topic

...

but yes

johnny_reilly

Thanks @sokra - when I get to a keyboard I'll swap mtimes for getTimes() and report back.

17:28

Despite various trains being out of action / missing in action I've made it to the dentist's; phew! I go in for my checkup and plan to take a look at the issue later that evening. In the meantime I'm hoping that Tobias (Sokra) will get a chance to republish so that ts-loader users aren't too impacted.

18:00

Done at the dentist and I'm heading home. Whilst I've been opening wide and squinting at the ceiling, TypeScript 2.2 has shipped. Whilst this is super exciting, according to Greenkeeper, the new version has broken the build. Arrrrghhhh...

I start to look into this and realise we're not broken because of TypeScript 2.2; we were broken because of the mtimes. Tobias has now re-added mtimes and published. With that in place I requeue a build and.... drum roll.... we're green!

The good news just keeps on coming as Luka Zakrajšek has submitted a PR which uses getTimes() in place of mtimes. And the tests pass. Awesome! MERGE. I just need to cut a release and we're done.

18:15

I'm home. My youngest son has been suffering from chicken pox all week and as a result my wife has been in isolation, taking care of him. We chat whilst the boys watch Paw Patrol as the bath runs. I flick open the laptop and start doing the various housekeeping tasks around cutting a release. This is interrupted by various bathtime / bedtime activities and I abandon work for now.

19:30

The boys are down and I get on with the release; updating the changelog, bumping the version number and running the tests. For various reasons this takes longer than it normally does.

20:30

Finally we're there; ts-loader 2.0.1 ships: https://github.com/TypeStrong/ts-loader/releases/tag/v2.0.1.

I'm tremendously grateful to everyone that helped out - thank you all!

Tuesday, 14 February 2017

@types is rogue

Or perhaps I should call this "@types and repeatable builds"....

The other day, on a React / TypeScript project I work on, the nightly CI build started failing. But nothing had changed in the project... What gives? After digging I discovered the reason: some of the type definitions which my project depends upon had changed. Why did this break my build? Let's learn some more...

We acquire type definitions via npm. Type definitions from Definitely Typed are published to npm by an automated process and they are all published under the @types namespace on npm. So, the react type definition is published as the @types/react package, the node type definition is published as the @types/node package. The hip bone's connected to the thigh bone. You get the picture.

The npm ecosystem is essentially built on top of semantic versioning and they take it seriously. Essentially, when a package is published it should be categorised as a major release (breaking changes), a minor release (extra functionality which is backwards compatible) or a patch release (backwards compatible bug fixes).

Now we get to the meat of the matter: @types is rogue. You cannot trust the version numbers on @types packages to respect semantic versioning. They don't.

The main reason for this is that, when it comes to versioning, the @types type definitions essentially look to mirror the version of the package they are seeking to type. THIS MEANS THE TYPE DEFINITION CANNOT DO ITS OWN SEMANTIC VERSIONING. A simple change in a type definition can lead to breakages in consuming code. That's what happened to me. Let's say an exported interface name changes; all code that relies upon the old name will now break. You see? Pain.

How do we respond to this?

My own take has been to pin the version numbers of @types packages; fixing to specific definitions. No "~" or "^" for my @types devDependencies.
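
In practice that means devDependencies which look something like this (the package names and version numbers here are just examples):

{
  "devDependencies": {
    "@types/node": "7.0.5",
    "@types/react": "15.0.4",
    "@types/react-dom": "0.14.23"
  }
}

With a caret ("^15.0.4") npm is free to pull in any later 15.x.x; with a tilde ("~15.0.4") any later 15.0.x. With an exact version you get the definition you tested against and nothing else.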

No respect for semantic versioning? No problem. You can go much further with repeatable builds and make use of Facebook's new npm client yarn and its lockfiles (very popular BTW), but I haven't felt the need yet. Pinning should be ample for now.

The other question that may be nagging at your subconscious is this: what's an easy way to know when new versions of my project's dependencies are available? Well, the Get-Package -Updates (NuGet hat tip) equivalent for npm that I'd recommend is npm-check-updates. It does the job wonderfully.
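
If you haven't come across it before, the basic usage looks something like this:

npm install -g npm-check-updates

# list the dependencies in package.json that have newer versions available
ncu

# rewrite package.json to use the latest available versions (review the diff before committing!)
ncu -u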

Wednesday, 1 February 2017

Hands-free HTTPS

I have had a *great* week. You? Take a look at this blog. Can you see what I can see? Here's a clue:

Yup, look at the top left hand corner.... see that beautiful padlock? Yeah - that's what's thrilled me. You see I have a dream; that one day on the red hills of the internet, the sons of former certificates and the sons of former certificate authorities will be able to sit down together at the table of HTTPS. Peace, love and TLS for all.

The world is turning and slowly but surely HTTPS is becoming the default of the web. Search results get ranked higher if they're HTTPS. HTTP/2 is, to all intents and purposes, an HTTPS-only game. Service Workers are HTTPS-only.

I care about all of these. So it's essential that I have HTTPS. But. But. But... Certificates, and the administration that goes with them. It's boring. I mean, it just is. I want to be building interesting apps; I don't want to be devoting my time to acquiring certificates and fighting my way through the (never simple) administration of them. I'm dimly aware that there are free certificates to be had thanks to the fine work of LetsEncrypt. I believe that work is being done on reducing the onerous admin burden as well. And that's great. But I'm still avoiding it...

What if I told you that you could have HTTPS on your blog, on your Azure websites, on your anywhere.... FOR FREE. IN FIVE MINUTES? Well, you can, thanks to CloudFlare. I did; you should too.

This is where I point you off to a number of resources to help you on your HTTPS way:

  1. Read Troy Hunt's "How to get your SSL for free on a Shared Azure website with CloudFlare"
  2. Watch Troy Hunt's Pluralsight course "Getting Started with CloudFlare™ Security"
  3. Go to Cloudflare's website and sign up

It just works. And that makes me very happy indeed.

Friday, 6 January 2017

webpack: resolveLoader / alias with query / options

Sometimes you write a post for the ages. Sometimes you write one you hope is out of date before you hit "publish". This is one of those.

There's a bug in webpack's enhanced-resolve. It means that you cannot configure an aliased loader using the query (or options in the webpack 2 nomenclature). Let me illustrate; consider the following code:


module.exports = {
  // ...
  module: {
    loaders: [
      {
        test: /\.ts$/,
        loader: 'ts-loader',
        query: {
            entryFileIsJs: true
        }
      }
    ]
  }
}

module.exports.resolveLoader = { alias: { 'ts-loader': require('path').join(__dirname, "../../index.js") } };

At the time of writing, if you alias a loader as above, then the query / options will *not* be passed along. This is bad, particularly given that with webpack 2 it is no longer possible to configure a loader by adding custom properties to the webpack.config.js. So what to do? Well, when this was a problem previously the marvellous James Brantly had a workaround. I've taken that and run with it:


var config = {
  // ...
  module: {
    loaders: [
      {
        test: /\.ts$/,
        loader: 'ts-loader',
        query: {
          entryFileIsJs: true
        }
      }
    ]
  }
}

module.exports = config;

var loaderAliasPath = require('path').join(__dirname, "../../../index.js");
// support both the webpack 1 ('loaders') and webpack 2 ('rules') property names
var rules = config.module.loaders || config.module.rules;
rules.forEach(function(rule) {
  // likewise, 'query' is the webpack 1 name and 'options' the webpack 2 name
  var options = rule.query || rule.options;
  // swap the 'ts-loader' alias for the real path, appending the stringified
  // options as a query string so that they still reach the loader
  rule.loader = rule.loader.replace('ts-loader', loaderAliasPath + (options ? '?' + JSON.stringify(options) : ''));
});

This approach stringifies the query / options and suffixes it to the aliased path. This works as long as the options you're passing are JSON-able (yes it's a word).
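
To be explicit about what that produces: after the forEach has run, a rewritten rule ends up looking roughly like this (the absolute path is purely illustrative):

{
  test: /\.ts$/,
  loader: '/home/you/project/../../../index.js?{"entryFileIsJs":true}'
}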

As I said earlier: hopefully by the time you read this the workaround will no longer be necessary. But just in case....

Sunday, 1 January 2017

webpack: configuring a loader with query / options

webpack 2 is on its way. As one of the maintainers of ts-loader I've been checking that ts-loader works with webpack 2. It does: phew!

ts-loader has a continuous integration build that runs against webpack 1. When webpack 2 ships we're planning to move to running CI against webpack 2. However, webpack 2 has some breaking changes. The one that's of particular relevance to our test packs is that a strict schema is now enforced for webpack.config.js. This has been the case since webpack 2 hit beta 23. Check the PR that added it. You can see some of the frankly tortured discussion that it generated as well.

Let's all take a moment and realise that working on open source is sometimes a rather painful experience. Take a breath. Breathe out. Ready to carry on? Great.

There are 2 ways to configure loader options for ts-loader (and in fact this holds for most loaders). Loader options can be set either using a query when specifying the loader, or through a ts property (insert the name of your loader here) in the webpack.config.js.

The implication of the breaking change is this: with webpack 2 you can no longer configure ts-loader (or any other loader) with a ts property (or equivalent) in the webpack.config.js. It must be done through the query / options. The following code is no longer valid with webpack 2:


module.exports = {
  ...
  module: {
    loaders: [{
      test: /\.tsx?$/,
      loader: 'ts-loader' 
    }]
  },
  // specify option using `ts` property - **only do this if you are using webpack 1**
  ts: {
    transpileOnly: false
  }
}

This change means that we have needed to adjust how our test pack works. We can no longer make use of ts for configuration. Since I wasn't terribly aware of query I thought it made sense to share my learnings.

What exactly is query / options?

Good question. Well, strictly speaking it's 2 possible things; both ways to configure a webpack loader. Classically query was a string which could be appended to the name of the loader much like a query string but actually with greater powers:


module.exports = {
  ...
  module: {
    loaders: [{ 
      test: /\.tsx?$/,
      loader: 'ts-loader?' + JSON.stringify({
        transpileOnly: false
      })
    }]
  }
}

But it can also be a separately specified object that's supplied alongside a loader (I understand this is relatively new behaviour):


module.exports = {
  ...
  module: {
    loaders: [{ 
      test: /\.tsx?$/,
      loader: 'ts-loader',
      query: {
        transpileOnly: false
      }
    }]
  }
}

webpack 2 is coming - look busy!

So if you're planning to move to webpack 2, be aware of this breaking change. You can start using configuration via query right now with webpack 1; you don't need to be using webpack 2 to make the jump. So jump!

Finally, and by way of a PS, query is renamed to options in webpack 2; a much better name to my mind. There's actually a bunch of other renames on the way as well - check out the migration guide for more on this. The important thing to note is that the old names work in webpack 2. But you should plan to move to the new naming at some point as they'll likely disappear when webpack 3 ships.
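
For completeness, here's roughly what a query-style ts-loader configuration looks like once it's rewritten with the webpack 2 names (loaders becomes rules, query becomes options):

module.exports = {
  // ...
  module: {
    rules: [{
      test: /\.tsx?$/,
      loader: 'ts-loader',
      options: {
        transpileOnly: false
      }
    }]
  }
}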