Niilo Keinänen

I give software companies the techniques they need to handle massive data files and real-time data sources with the best possible performance.

Showcases

1. Analysed bottlenecks and optimized the whole process of parsing, uploading, saving, loading, caching, and visualizing a data file (CSV/XLSX/XML/JSON)
Next, React

Before:

  • Maximum file input 84 MB

  • Load time 19.1 seconds

After:

  • Maximum file input 8 112 MB

  • 84 MB file loaded in 7.8 seconds

Able to load and handle 97x larger files!
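
The figures above don't show the implementation. As one illustration only, a generic technique that typically raises the maximum file size a browser app can accept is streaming the file in chunks instead of reading it into memory in one piece. A minimal sketch in TypeScript (the CSV layout and the onRow callback are made up for the example; this is not the exact code behind the showcase):

```ts
// Illustrative sketch: parse a large CSV File chunk by chunk via the Streams
// API, so peak memory stays proportional to the chunk size, not the file size.
async function parseCsvStreaming(
  file: File,
  onRow: (cells: string[]) => void
): Promise<void> {
  const decoder = new TextDecoder();
  const reader = file.stream().getReader();
  let remainder = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // Decode this chunk and prepend any partial line left from the last chunk.
    const text = remainder + decoder.decode(value, { stream: true });
    const lines = text.split("\n");
    // The final element may be an incomplete line; keep it for the next chunk.
    remainder = lines.pop() ?? "";
    for (const line of lines) {
      if (line.length > 0) onRow(line.split(","));
    }
  }
  // Flush the last line, if any.
  if (remainder.length > 0) onRow(remainder.split(","));
}
```

Parsing rows as they arrive, instead of materializing the whole file as one string, is usually what unlocks much larger inputs.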


2. Optimized time-series dataset loading from backend to frontend
Express, HTML, Parquet

Before:

  • Load time 72.26 seconds

  • Message size 455 MB

After:

  • Load time 5.38 seconds

  • Message size 32 MB

User gets the same data a whole minute faster!
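
Again, the before/after numbers don't reveal the implementation. Purely as an illustration of the kind of change that produces reductions like this, a backend can ship time-series columns as raw typed-array bytes instead of row-oriented JSON. A minimal Express sketch under that assumption (the /series endpoint and the generated data are hypothetical):

```ts
import express from "express";

const app = express();

// Hypothetical endpoint: send one numeric column as raw Float64 bytes,
// 8 bytes per value, instead of JSON text at roughly 15-20 characters per value.
app.get("/series", (_req, res) => {
  const values = new Float64Array(1_000_000);
  for (let i = 0; i < values.length; i++) values[i] = Math.sin(i / 500);

  res.setHeader("Content-Type", "application/octet-stream");
  res.end(Buffer.from(values.buffer, 0, values.byteLength));
});

app.listen(3000);
```

On the frontend, the response can be read straight into a typed array with `new Float64Array(await response.arrayBuffer())`, skipping JSON parsing entirely.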

About me

To quote one of my past university professors:

"In software engineering, you can really choose only one (or maybe two) quality attributes as the most important."

For example, if your product should be extremely easy to use, then it probably has to take a hit on performance. You can't have it all.

Of course, in practice, things aren't always so black and white. But what I do know is that performance-oriented software development is a special skill that most developers (and LLMs 😉) don't have.

I've spent my humble 8+ year career in software development focusing entirely on performance. Over at LightningChart Ltd. we develop some pretty performance-intensive software. During this time, I've built the mindset and skillset to make applications perform differently.

Now I offer this expertise as a consulting service.

Contact

If you think your software development team could benefit from an outside perspective on performance, leave me a message!

FAQ

  • What tech stack or programming languages can you help with?

The most important aspects of my advice are not tied to any specific tech stack or programming language: things like file formats, structuring data, and transferring data. So regardless of your tech stack, I can give you extremely valuable performance advice.

In any case, I'm a web developer, so anything JS/TS is my comfort zone when it comes to actual implementation work.

  • I don't believe you can drop a message size of 455 MB to 32 MB. Are you down-sampling the information?

You'd be shocked to know how memory-wasteful the most commonly used practices for transferring data are. Practices like string-based formats (especially JSON), row-oriented structures, and structured cloning can result in A) duplicating entire datasets just to send/receive them, and B) easily using 10 times more memory to represent the same information.
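
To illustrate that difference, the sketch below puts the same numeric series into a row-oriented JSON string and into a column-oriented typed array, and shows how a transfer list avoids structured-cloning the data when handing it to a Web Worker (the column name and the worker file are made up):

```ts
// Illustrative sketch: the same series as row-oriented JSON vs. a columnar
// typed array, plus transferring (not cloning) the buffer to a Web Worker.
const count = 100_000;
const temperatures = new Float64Array(count);
for (let i = 0; i < count; i++) temperatures[i] = 20 + Math.sin(i / 500);

// Row-oriented, string-based: the property names and the decimal digits are
// repeated for every single row.
const rowJson = JSON.stringify(
  Array.from(temperatures, (t, i) => ({ index: i, temperature: t }))
);
console.log("row-oriented JSON:", rowJson.length, "characters");

// Column-oriented, binary: exactly 8 bytes per value, no per-row overhead.
console.log("columnar Float64Array:", temperatures.byteLength, "bytes");

// postMessage structured-clones its argument by default (a full copy).
// Listing the underlying buffer as a transferable moves it instead.
const worker = new Worker(new URL("./worker.ts", import.meta.url)); // hypothetical worker file
worker.postMessage({ temperatures }, [temperatures.buffer]);
```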

  • Our application already uses compression (gzip/brotli/etc.). What more can you offer us?

The first showcase presented on this page was already using brotli compression before my changes. Yet I was still able to increase the maximum data capacity by almost 100x. The majority of my optimization techniques are still applicable even if your messages/files are already compressed.
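
If you want to see this for yourself, a measurement sketch like the one below compares the brotli-compressed size of the same series as JSON text and as raw Float64 bytes (the generated signal is made up, and real datasets will behave differently; the point is simply that compression works on top of whichever representation you choose, so the representation still matters):

```ts
// Illustrative measurement: compress the same numbers as JSON text and as
// raw Float64 bytes, and print the sizes for comparison.
import { brotliCompressSync } from "node:zlib";

const values = Float64Array.from({ length: 100_000 }, (_, i) =>
  Math.sin(i / 50) * 100
);

const asJson = Buffer.from(JSON.stringify(Array.from(values)));
const asBinary = Buffer.from(values.buffer, 0, values.byteLength);

console.log("JSON raw:      ", asJson.length, "bytes");
console.log("JSON brotli:   ", brotliCompressSync(asJson).length, "bytes");
console.log("binary raw:    ", asBinary.length, "bytes");
console.log("binary brotli: ", brotliCompressSync(asBinary).length, "bytes");
```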