Abstract Nonsense

A place for musings, observations, design notes, code snippets - my thought gists.

Talk: Implementing Efficient Language Models under Homomorphic Encryption

Today (11/11, Pepero Day in Korea), I had the pleasure of attending a fascinating talk on Implementing Efficient Language Models under Homomorphic Encryption by Donghwan Rho (노동환) at the Seoul National University Research Institute of Mathematics’ 2025 11.11 Symposium.

I am on holiday in Korea and didn’t want to miss this opportunity to learn more about homomorphic encryption - an area I’ve been increasingly fascinated by as of late!

# Talk Abstract

As large language models (LLMs) become ubiquitous, users routinely share sensitive information with them, raising pressing privacy concerns. Homomorphic encryption (HE) is a promising solution to privacy-preserving machine learning (PPML), enabling computations on encrypted data. However, many core components of LLMs are not HE-friendly, limiting practical deployments. In this talk, we investigate the main bottlenecks - softmax, matrix multiplications, and next-token prediction - and how we address them, moving toward the practical implementation of LLMs under HE.

# What is Homomorphic Encryption?

The core premise of homomorphic encryption is that it allows computations to be performed on encrypted data without needing to decrypt it first. Loosely speaking, the homomorphism property of HE allows for addition and multiplication to be preserved under the ciphertext transform. From these primitive operations (and some others, depending on the scheme), you can compile homomorphic circuits that perform semi-arbitrary computations on encrypted data. The big obstacle to overcome is that homomorphic encryption schemes are typically very computationally expensive, and so making them efficient enough for practical use cases (such as LLM inference) is an active area of research.
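
To make this concrete, here is a minimal sketch of encrypted addition and multiplication using TenSEAL, an open-source Python wrapper around Microsoft SEAL that exposes the CKKS scheme (the scheme discussed in the talk). TenSEAL wasn't mentioned in the talk - I'm only using it for illustration, and the parameters below are illustrative rather than a security recommendation:

```python
import tenseal as ts  # pip install tenseal

# Set up a CKKS context (illustrative parameters, not a security recommendation).
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2**40
context.generate_galois_keys()

# Encrypt two vectors of real numbers.
enc_a = ts.ckks_vector(context, [1.0, 2.0, 3.0])
enc_b = ts.ckks_vector(context, [4.0, 5.0, 6.0])

# Addition and multiplication happen directly on the ciphertexts.
enc_sum = enc_a + enc_b
enc_prod = enc_a * enc_b

print(enc_sum.decrypt())   # approximately [5.0, 7.0, 9.0]
print(enc_prod.decrypt())  # approximately [4.0, 10.0, 18.0]
```

Note that CKKS is an approximate scheme over real numbers, so the decrypted results are close to, but not bit-exactly, the plaintext answers - a trade-off that makes it well suited to machine learning workloads.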

There are various schemes for homomorphic encryption, each with their own set of permissible operations and associated trade-offs, but I’ll defer a more thorough treatment of the intricacies to this excellent blog post series by Jeremy Kun.

# Talk Summary

Full disclaimer: The talk was delivered in Korean, which I sadly do not speak, so I followed along partly via live translation and partly via the English slides (available here).

The talk was partitioned into three sections:

  1. An overview of the CKKS homomorphic encryption scheme and its applications to private transformer inference.
  2. The application of HE to fine-tuning LLMs: the replacement of the softmax in the attention layer with Gaussian-kernel attention and the use of LoRA to reduce the number of ciphertext-ciphertext matrix multiplications (I sketch the attention idea just after this list). This section is based on the paper Encryption-Friendly LLM Architecture by Rho et al., ICLR 2025 (OpenReview).
  3. How to address the lack of random sampling algorithms in CKKS HE schemes when performing next-token prediction. There’s a lot of intricacy here that I sadly couldn’t follow, but a particularly interesting component that stood out to me was applying the Travelling Salesman Problem to define an optimisation problem over the token indices to minimise cosine embedding distances between adjacent tokens (a toy illustration of this ordering objective is the second sketch below). This section is based on the paper Traveling Salesman-Based Token Ordering Improves Stability in Homomorphically Encrypted Language Models by Rho et al. (arXiv).
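
To give a flavour of the softmax replacement from point 2, here is a rough plaintext NumPy sketch of the general idea as I understood it. This is my own reconstruction, not the paper's implementation; the exact kernel, scaling, and normalisation details may well differ:

```python
import numpy as np

def softmax_attention(Q, K, V):
    # Standard attention: the row-wise softmax needs exp, a running max and a
    # division, all of which are awkward to evaluate on CKKS ciphertexts.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def gaussian_kernel_attention(Q, K, V, gamma=1.0):
    # HE-friendlier variant: weight each value by an unnormalised Gaussian
    # kernel of the query-key distance, exp(-gamma * ||q_i - k_j||^2).
    # There is no max or division, and exp can be approximated by a
    # low-degree polynomial on encrypted inputs.
    sq_dists = ((Q[:, None, :] - K[None, :, :]) ** 2).sum(axis=-1)
    weights = np.exp(-gamma * sq_dists)
    return weights @ V

Q, K, V = (np.random.randn(4, 8) for _ in range(3))
print(gaussian_kernel_attention(Q, K, V).shape)  # (4, 8)
```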
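
And for point 3, a toy illustration of the ordering objective: choose a permutation of the token indices so that adjacent tokens have small cosine distance between their embeddings. The paper frames this as a Travelling Salesman Problem; the greedy nearest-neighbour pass below is just a stand-in heuristic to show the objective, not the paper's actual method:

```python
import numpy as np

def greedy_token_ordering(emb):
    # emb: (vocab_size, dim) token embedding matrix.
    # Greedy nearest-neighbour tour: start at token 0 and repeatedly hop to
    # the unvisited token with the highest cosine similarity (i.e. the
    # smallest cosine distance), so neighbouring indices get similar embeddings.
    emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    order = [0]
    unvisited = set(range(1, emb.shape[0]))
    while unvisited:
        rest = np.array(sorted(unvisited))
        sims = emb[rest] @ emb[order[-1]]
        nxt = int(rest[np.argmax(sims)])
        order.append(nxt)
        unvisited.remove(nxt)
    return order

vocab = np.random.randn(50, 16)   # stand-in for a model's embedding table
print(greedy_token_ordering(vocab)[:10])
```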

# What’s next?

Many thanks to Donghwan Rho for the talk! I’m looking forward to reading the papers in more detail when I have some time and hope to update this page or blog more about what I learn in the future!

Welcome to the Hyunam-Dong Bookshop by Hwang Bo-Reum

I really enjoyed this book! The original is in Korean, but the version I read was impeccably translated by someone with a masterful command of idiomatic English.

The book charts the life of a burnt-out Korean worker, Yeongju, as she embarks on a new venture by opening a bookshop. Seeing the stereotypes of Korean working culture explored through the interactions of Yeongju with her customers was both concerning and illuminating.

The moment she stepped inside [the bookshop], she relaxed, as if her body and senses basked in the comfort of returning to her workplace. In the past, she used to live by mantras like passion and willpower, as if by imprinting the words on her mind, they would somehow breathe meaning into her life. Then one day she realised it felt like she was driving herself into a corner, and she resolved never to let those words dictate her life again. Instead, she learnt to listen to her body, her feelings, and be in happy places. She would ask herself these questions: does this place make me feel positive? Can I be truly whole and uncompromisingly myself? Do I love and treasure myself here? For Yeongju, the bookshop checked all the boxes.

I enjoyed the fresh writing style; the way the protagonist’s inner monologue is rendered faithfully into prose never felt trite or overdone. The writing felt mellifluous and the book was a reflective read. It really is an ode to the love of the written word:

But now, she treated the silence as a day’s rest for her voice and was perfectly at ease. When she wasn’t talking, her inner voice grew louder. She wasn’t talking, but she still spent the whole day thinking and feeling. Instead of sounds, she expressed herself through the written word. Sometimes, she even wrote three essays on a single Sunday. But these belonged solely to her, and were never shared with anyone else.

I can’t say I’m this prolific a writer, but I suppose there’s something relatable about spending time in quietude writing and pondering.

… Koreans were raised in a culture where they were taught to be conscious of the eyes of others, which made them, Yeongju included, more self-conscious of how they were perceived. Perhaps this was what drew her to the writing of authors from abroad, to those who grew up in a different culture, and who were different in the way they thought, felt, and expressed themselves.

I think this interest is symmetric! It’s precisely why I enjoy reading works of foreign literature.

Dissonance before moments of harmony makes the harmony sound beautiful. Just as harmony and dissonance exist side by side in music, life is the same. Because harmony is preceded by dissonance, that’s why we think life is beautiful… Is there a way that will accurately tell us whether the current moment we’re living in is harmony or dissonance? How do I tell what state I’m in now? Hmm, you won’t quite know while you’re in the moment. It’s only when you look back that the answer is clear.

By the time Seungwoo finished showering, preparing dinner, eating, resting, and doing the dishes, the clock struck eight. This was when he turned into a completely different person. As he shrugged off the cloak of an ordinary company employee, it was as if he, too, put aside the responsibilities of his title, erased the preprogrammed thoughts and actions, and peeled off the facade of indifference. From this moment, every second belonged completely to him. Time was real.

For the past few years, the hours before bedtime were when he could be truly himself, diving deep into something that captivated his interest - the Korean language. He’d spent the past ten years immersed in programming languages, but he was no longer a programmer. Right now, he was just another ordinary company employee, dutifully checking in and out of the office every day. Immersing himself in the Korean language was tiring, but fun. He enjoyed having something to focus on whole-heartedly, devoting himself to studying something he liked. The energy expended at work, he recharged at home.

I strongly relate to this: I find I have almost boundless energy for dabbling with maths and CS outside of work (or whatever other random rabbit hole I find myself in). It’s a refreshing break from the day-to-day that I find restorative and joyful.

And last but not least, an interesting typesetting note I learned from the book jacket:

The text of this book is set in Minion, a digital typeface designed by Robert Slimbach in 1990 for Adobe Systems. The name comes from the traditional naming system for type sizes, in which minion is between nonpareil and brevier. It is inspired by late Renaissance-era type.

Euclidea: An interactive geometric theorem prover

After watching the excellent piece of exposition on mathematical exploration by 3Blue1Brown guest Ben Syversen: “What was Euclid really doing?”, I discovered the fantastic web and mobile game Euclidea.

Geometry was never my favourite branch of mathematics; I always felt more drawn to algebra and to more symbolic and abstract forms of reasoning. But Euclidea really does a masterful job of capturing the joy of exploration and problem solving through the lens of scaffolded puzzles of compass-and-straightedge geometric constructions.

Each solution you construct gets encapsulated and added to your toolkit of re-usable constructions. It’s a great way to demonstrate the power of axiomatisation and of iteratively increasing the level of abstraction.

Rust for everyone [video]

A while ago I mentioned that I thought Rust was a fiendishly complex language at first glance. Well, that hasn’t changed. Don’t get me wrong, I love Rust, and I wish I had the time to dive deeper into it, but it has a steep learning curve.

That said, I recently came across a fantastic video, “Rust for Everyone”, presented by Will Crichton at Jane Street that explores the various tools he’s built to make Rust more accessible to newcomers. It’s not in the video description, but here’s an outline of the tools Will discusses:

  1. Aquascope: A tool that provides a visual representation of Rust’s ownership model, making it easier to understand how ownership and borrowing work in Rust.
  2. Argus: An improved trait debugger that lets you drill down as much (or as little) as you want through a trait type-check compiler error.
  3. Flowistry: I think this is the coolest tool here. It allows you to trace the ‘effect-flow’ of your program, highlighting only the sections of code that could possibly affect a selected variable.

The font of all knowledge

TIL that Google Search has a font-based Easter egg. If you search for “Georgia font”, the results page will change the typeface to Georgia. I initially thought it was limited to system fonts, but Wikipedia shows that it works for Roboto too!

I discovered this when looking up various fonts for this blog. I initially specified the default browser sans-serif font in my CSS, but ended up switching to Georgia for the body text, as I find it more readable for longer posts. There are some beautiful fonts in Google Fonts I’d love to use, but for now I’m going to stick with the System Font Stack to avoid loading more external resources on page-load.