
Debugging a slow VS Code TS project with tsc generateTrace

VS Code felt slow. All of a sudden, saving took forever in a TS project. The first suspect was a recently added WallabyJS test runner extension, so I disabled all extensions, but saving still took several seconds.

My head then turned towards TS. Arrogantly, I blamed some dependency, but I didn't have a clue which one. The GitHub TS wiki has some tips and tricks on what to do, including instructions for running a profiler. It turns out a new profiling tool, generateTrace, was introduced not too long ago, in TS 4.1.
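For reference, a minimal way to produce such a trace (assuming TypeScript 4.1+ and a project-local tsconfig.json) is roughly:

tsc -p tsconfig.json --generateTrace trace

The trace directory then contains a trace.json you can open in a Chromium trace viewer such as chrome://tracing or Perfetto, alongside the types.json file mentioned below.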

The output looks like this.

Compiling the slowest source file takes more than 7 seconds!

There is one massively slow checkSourceFile call which, as the name suggests, type-checks a single source file. To my embarrassment, it was in my code 😱

The profiling output is rather hard to decipher. The UI shows which TS internal functions are called and references a separate JSON file containing the type names. In my case, the slow functions included some weird single-character type names, so that was not much help.

The arguments are references to the types.json file

I reverted to bisect debugging: I removed code at random until everything was back to normal. It turns out the bug was in the react-hook-form typings. I had an "extra" explicit type parameter on the form's Controller, which reduced performance massively. I didn't dig too deeply into why this happens, but I should definitely open an issue.

// Fast TS compile
<Controller
  control={control}
  ...>
</Controller>

// Slow TS compile
<Controller<MyInputs, 'myField'>
  control={control}
  ...>
</Controller>
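For context, here is a minimal, self-contained sketch of the fast variant, assuming react-hook-form v7 and a hypothetical MyInputs form type; the point is simply to let TypeScript infer the Controller generics from control and name instead of spelling them out:

import * as React from 'react';
import { useForm, Controller } from 'react-hook-form';

// Hypothetical form value type, for illustration only
type MyInputs = {
  myField: string;
};

export function MyForm() {
  const { control, handleSubmit } = useForm<MyInputs>();

  return (
    <form onSubmit={handleSubmit((data) => console.log(data))}>
      {/* No explicit <MyInputs, 'myField'> on Controller: the generics are inferred */}
      <Controller
        control={control}
        name="myField"
        render={({ field }) => <input {...field} />}
      />
    </form>
  );
}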

After the fix, the source file was gone from the trace tool visualization, and VS Code was back to normal speed.

Still a bit slow to compile, but much better

