Conf42 Cloud Native 2024 - Online

Hono: Multi-Runtime Web Framework for the Edges

Abstract

Meet Hono, a web framework making it possible to run the same code on different serverless cloud providers like AWS Lambda and Cloudflare Workers, and different runtimes like Node.js, Deno, and Bun. We’ll explore Hono itself, Web Standard APIs, and WinterCG making this interoperability possible.

Summary

  • Nikolay Pryanishnikov is a full stack engineer at Station Labs. Recently he stumbled upon a really cool new web framework called Hono. Today he'll explain how it's able to run on different runtimes and different cloud services, and why you should consider building your next project with Hono.
  • An engine is the software that executes your code. A runtime is the environment in which that code is executed. The most popular engine is V8, which powers Chrome. Your service needs at least one runtime to execute your code.
  • One service does not necessarily equal one runtime. Modern runtimes may offer Web Standard APIs only, which means you can write your code once and run it everywhere. WinterCG came along to increase the interoperability of runtimes.
  • Hono is a new web framework built on the latest web standard APIs. It's able to run on multiple runtimes and on multiple cloud services. One of my favorite features of Hono is that it's completely type safe.
  • Hono has a very simple API for middleware. The built-in middleware covers everything you want to do with authentication, caching, CORS headers, logging and so on. You can get even very complex middleware done with Hono in a very easy way.
  • Another one of the most important features of API libraries is validation, and Hono provides a Zod validator for you. It's very easy to get going or even to migrate your existing projects. Because the API is so simple, it's easily extendable.

Transcript

This transcript was autogenerated. To make changes, submit a PR.
Hi everyone. Thank you for coming to my presentation. Today we'll be talking about Hono, a multi-runtime web framework for the edges. But first, let me take a moment to introduce myself. My name is Nikolay Pryanishnikov. I'm a full stack engineer at Station Labs. We develop some cool stuff on the blockchain with NFTs and regular tokens. In my free time I like to contribute to open source projects and develop my own. I also do security research, some bug bounty, CTF and things like that. Previously I was an original full stack contributor at Lido Finance, one of the largest blockchain projects by TVL. Before that I was a startup founder and the founder of a small web development studio. What's common across all of my developer experience is that I've been building a lot of APIs. Recently I stumbled upon a really cool new web framework called Hono. Today we'll be taking a look at it, and I'll explain to you how it's able to run on different runtimes and different cloud services.
So the plan for today is in two parts. First, we are going to discuss JS runtimes. We'll understand the differences between a JS engine and a runtime, we'll look at some examples of standalone and cloud runtimes, and we'll discuss API interoperability of runtimes in general. Then we'll switch to Hono. I'll explain to you what Hono is, why it's awesome, and why you should consider building your next project with Hono.
So let's begin with JS runtimes. First we need to understand what a JS engine is, and this is quite simple: an engine is the software that executes your code. However, that's not enough for a good developer experience, and a runtime is the environment in which the code is executed. It provides some nice features for us, like the event loop, so we don't have to manually schedule when code is executed. Then it provides some runtime libraries and APIs for us, so for example we don't have to write low-level code to access the file system. The most popular engine is V8, pictured here on the left; it powers Chrome. On the right we have, at the top, JavaScriptCore, which powers Safari on iOS and macOS for example, and then we have SpiderMonkey, which powers Firefox.
For runtimes, in this case standalone runtimes, you have probably heard about Node.js. It's the most popular and the oldest runtime. But we also have newer, more modern runtimes: on the right here, at the top, we have Deno, and the newest one, at the bottom, Bun. Node.js runs V8, Deno runs V8 as well, and Bun runs JavaScriptCore. Then we have a lot of service runtimes, and your service needs at least one runtime to execute your code. AWS actually has two, Lambda and Lambda@Edge, Cloudflare has their workerd runtime for their Workers offering, and Vercel also has two runtimes, the Node.js runtime and the Edge runtime.
Runtimes will have different APIs, and this is clearly visible for standalone runtimes. In this example, we are trying to open a file. For Node.js you have to import readFile from the fs module, and you have to specify the file and the encoding. For Deno the situation is different: the Deno namespace is globally available, and you also get some nice helpers. For example, you have a readTextFile helper, so you don't have to specify the encoding. Service runtime APIs are also different. For AWS Lambda, in the handler you will have an event object and a context object, and in order to return some data to the user, you use a return statement.
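To make the fragmentation concrete, here is a minimal sketch of the file-reading difference just described. It is not taken from the slides: the file name is a placeholder, and each half only works on its own runtime (the fs import is Node.js-only, the Deno global is Deno-only), so treat it as a comparison, not a single runnable program.

```typescript
// Node.js: file access goes through the fs module, and you pass the encoding yourself.
import { readFile } from "node:fs/promises";

const fromNode = await readFile("./hello.txt", "utf-8"); // string

// Deno: the Deno namespace is a global, and readTextFile assumes UTF-8 for you.
// (This line only runs under Deno; the import above only runs under Node.js.)
const fromDeno = await Deno.readTextFile("./hello.txt"); // string
```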
For Vercel's Node.js runtime, you have a request and a response object. To return some data, you need to call send on the response object. Ideally you would want to return here as well, but it's not necessary; it's going to work either way.
Next we get to the problem that one service does not actually equal one runtime. Vercel, for example, has two runtimes, and you can't migrate from one to the other without code changes. Node.js is the older runtime; Edge is a newer runtime built on modern web technologies, but it's also more limited. And speaking about limitations, one of the service runtime limitations is limited NPM package access. Back in the day there were runtimes where you couldn't import NPM packages at all. Today the situation is much different, but NPM package access can still be a problem. For example, you could be importing an NPM package which tries to access some API which is not available in that runtime. Which brings us to the second point, a limited API set. Modern runtimes potentially have Web Standard APIs only, which we'll be talking more about in a moment. Also, for service runtimes, direct database access might not be possible at all. So if you wanted to connect directly to your database and get some data from it or insert some new data, it might not be possible; you would need a proxy, or you could migrate to a different runtime. Taking Vercel Edge as an example, we clearly see that although you can import NPM packages, you can only import ESM packages, not CJS. So if you're relying on CJS dependencies, then you are out of luck. You would have to change your runtime from Vercel Edge to the Vercel Node.js runtime, and then you wouldn't be using the latest web technologies, which is a separate problem.
So WinterCG came along. It's a community group created with all the industry leaders in the space, from Cloudflare to Vercel, and from standalone runtimes we have Deno and Node.js. Their mission is increasing the interoperability of runtimes. For the end user, it means that ideally you would have the same code being able to run on different runtimes and different cloud services. If you don't like one service, you could easily migrate to another, and so on and so forth. But how is that possible? It's possible using the Web Standard APIs: standard APIs which are available across runtimes, so you can have the same code running everywhere. An example of such an API is fetch. You can call fetch not only on the standalone runtimes, not only on services, but also in the browser. You can write your code once and run it everywhere. We're talking about fetch specifically because it has the Request and Response objects, which you can now reuse in modern runtimes.
For Vercel, in this case for the Node.js runtime, you get an API with a request and a response object: you set a status code on the response, you send some data, and you return. With the newer API for their Edge runtime, you actually get the request in the form of the Request you are familiar with from fetch; it's the same Request. In order to respond, you create a new Response object, the same one from fetch, and you return it. Inside it you can specify the thing to return to your users, you can specify the status code, and you can specify options like headers, for example. This API is much cleaner, much less error-prone, and much better to use.
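Here is a minimal sketch of that newer, fetch-style handler shape: a Request comes in, a Response goes out. The exact way the function is exported or registered differs per platform (Vercel Edge, Cloudflare Workers, Deno Deploy each have their own convention), so this is an illustration of the Web Standard APIs rather than a ready-to-deploy handler.

```typescript
// A handler written against Web Standard APIs only: it receives a Request and
// returns a Response, the same objects you already know from fetch().
export default async function handler(request: Request): Promise<Response> {
  const url = new URL(request.url);

  // new Response(body, init): the body to return, plus options like status and headers.
  return new Response(JSON.stringify({ path: url.pathname }), {
    status: 200,
    headers: { "content-type": "application/json" },
  });
}
```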
One example of why it's better is a situation I ran into multiple times myself, and at workplaces I've worked at. What would happen is that you would set a status code and set some data to be returned, but then you would forget a return statement, and the rest of the code would be executed. In production you would run into errors and your handlers wouldn't work as expected. So you would have to spend time finding the handlers which have this problem, debugging and fixing them, just by adding a return after the logic or adding an if statement. The newer API prevents this completely: if you are returning, you must return something, and if you're returning something, then execution stops and this situation will no longer happen.
Let's switch to Hono. Hono is a new web framework built on the latest Web Standard APIs. That's how it's able to run on multiple runtimes and on multiple cloud services. Hono means flame in Japanese. It was created by Yusuke Wada, a developer advocate at Cloudflare. We've been talking about its flexibility, but it's also very simple, it's small, it's fast, and it provides just a fantastic developer experience. Overall, it's been gaining a lot of popularity, and I think that's because so many developers are finding out about it and starting to migrate their projects to Hono and develop new projects with Hono.
Hono's API is very simple. You simply import Hono, you initialize your app, and you can start setting up the endpoints straight away. In this case, for our app, we are setting up a GET endpoint at /hello, and we are setting up a handler which returns some JSON. In this case we are returning a message with a simple string, hello. As you can see, the API is very straightforward and it's very easy to get started. And if you are migrating your old project to Hono, there's basically nothing new to learn; it's so simple that you can get started straight away.
Now let's get to the main points of why it's awesome and why you should ideally use it. First, it's very flexible. We've been talking about flexibility a lot, but with Hono you can actually migrate from server to serverless environments and vice versa. You can switch standalone runtimes, so you can migrate from Node.js to Deno, for example, and back. And you can also switch hosting services: if for some reason you don't like AWS, you can migrate to Cloudflare, or the other way around. One of my favorite flexibility points is that it's so flexible you can actually embed it in your Next.js project. You will have one repository with your front end and your back end in the API folder, and Vercel will deploy everything for you. If for some reason tomorrow you decide that Vercel is not the best platform for your API, you can just take all the code from the API folder and deploy it in a standalone way without any code changes whatsoever. It's fantastic. So for flexibility, you can run Hono on Node.js, on Deno, on Bun, pretty much any standalone runtime. And for services you can choose from a lot of different runtimes: for AWS it even works on both Lambda and Lambda@Edge. Same goes for Vercel, you can run it on the Node.js runtime and the Edge runtime. For the Node.js runtime you will have to use an adapter, but it's simply one added dependency and one or two changed lines of code. Hono is also very fast. It's got the fastest router for Cloudflare Workers and for Deno.
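As a rough sketch of the /hello endpoint described above, assuming the hono package is installed: the GET handler returns JSON via the request context, and the default export is how runtimes like Bun or Cloudflare Workers pick the app up (other targets wrap the same app with a small adapter).

```typescript
import { Hono } from "hono";

const app = new Hono();

// GET /hello responds with a small JSON payload; c is the per-request context.
app.get("/hello", (c) => c.json({ message: "hello" }));

// Picked up directly by e.g. Bun or Cloudflare Workers; Node.js or AWS Lambda
// would wrap this same app with an adapter instead.
export default app;
```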
Also, if you're using the Node.js adapter, it's still going to be about three times faster than Express, so massive performance gains. Hono has different routers for different applications. By default, the smart router will be used, which automatically selects the best router for you from the regular expression router and the trie router. The regular expression router is the fastest in the JavaScript world; the trie router is slower, but it supports all patterns. If you are running Hono in a one-shot environment, a serverless environment like Fastly Compute for example, then it's best to use the linear router. If you're using Hono in an environment with very limited resources, it's best to use the pattern router; it's the smallest one.
One of my favorite features of Hono is that it's completely type-safe. Everything you do, everything you touch, has types. It's very easy to see what kind of data you are dealing with. We're going to be talking about data validation as well; it's very easy to do with Hono too. Everything inside Hono is type-safe, and it's a very good developer experience. Because it's completely type-safe, it can infer the types of your application, and you can create an auto-generated client with Hono. In my mind, that's the killer feature of Hono. You set up your app just as you normally would. Then you can derive the types here, an AppType from the app, and this will store all the information about your API: all the endpoints, and all the inputs and outputs of your endpoints as well. Then you import the Hono client, initialize it with the AppType, specify the base API path, and you have a completely type-safe client for your API, with all the inputs, outputs and endpoints, in a completely type-safe manner. With my example with Next.js, you can actually have your API and your front end in one project, and you can generate a client and use it on the front end. So you can interact with your APIs very easily, and when you change your API it will be synced automatically; there's nothing to do on your front end and nothing to change in the way you interact with the API. Also, if you have a public API, that means you can export your client as a separate package, so users can install it from NPM and get going straight away. It provides a very cool developer experience: all the endpoints of your API get autocomplete, and you never have to worry about what type of data your API is returning.
Let's talk about middleware, one of the most important features of any API library. Hono has a very simple API for middleware. If you just want to use logging, for example, you can just say app.use and specify the logger middleware. You can also specify the path your middleware should apply to; in this case we are setting some CORS headers. And you can also specify the method and the path: in this case we are setting basic authentication for just the POST requests, so GET requests will not apply this middleware, but POST ones will. Middleware is very easy to develop with Hono, and you get a lot out of the box. Here is a huge list: the included middleware covers everything you want to do with authentication, caching, CORS headers, logging and so on. And there's a lot of third-party middleware as well, like Sentry error integration, Zod validation, OpenAPI and things like that. Pretty much everything you would need has already been developed and you can use it today.
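Here is a minimal sketch of the auto-generated, type-safe client idea, assuming the hono package: the server and client would normally live in separate files or packages, and the base URL is just a placeholder. Chaining the route definition onto new Hono() is what lets the client infer the endpoint types.

```typescript
import { Hono } from "hono";
import { hc } from "hono/client";

// Define the API; chaining the route keeps its input/output types in the app's type.
const app = new Hono().get("/hello", (c) => c.json({ message: "hello" }));

// Export the type (not the app itself) so a front end or a published client
// package can consume it.
export type AppType = typeof app;

// Placeholder base URL; point it at wherever the API is actually deployed.
const client = hc<AppType>("http://localhost:8787");

const res = await client.hello.$get();
const data = await res.json(); // inferred as { message: string }
console.log(data.message);
```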
But if you don't actually find something that you want, you can easily create it. All that you need to do is write one function with the context and the next function. In the first example, you can log anything from the request or the response objects. When you're done dealing with the request object, you just call next, so the next middleware will be triggered or the request will end. It's also very easy to adjust some headers, for example: in the second example you just need to call next and then set the headers you want. It's very easy. You can get even very complex middleware done with Hono in a very easy way.
Another one of the most important features of API libraries is validation. Basically, you don't want to be dealing with data when you don't know what's inside of it, and it's best to use a validator. You can use Zod; it's the gold standard for data validation in the JavaScript world, and Hono provides a Zod validator for you. Here is just my personal example of what I like to do. First I specify the schema: I say that I'm expecting an object with a name of string type. Then I set up my Hono endpoint. I say that the endpoint will be /hello, I specify the middleware here, in this case the Zod validator, and I specify the schema I just created. You also need to specify the part of the request you are actually checking. If the format is wrong, users immediately get an error so they can correct the data. But inside the handler, the data is already validated; you just need to access it from the context request's valid method and specify which part of the request you are checking. And then it's going to be completely type-safe: for the name we're accessing here, it will be inferred that it's a string.
Another major feature of API libraries is automatically generated documentation, and Hono covers you here completely. You can just import the OpenAPI Hono module. In this case you initialize OpenAPI Hono the same way you would regular Hono, but then every route you set up will actually save its information. So in this case we are returning an ID, an age which is a number, and a name which will be a string, exactly as shown here. Then you set up the documentation on the doc endpoint, you specify some metadata, and that's it: the documentation will be available on this endpoint. Users just have to go there and everything will be presented to them. You don't have to run any generation steps; everything works out of the box, nothing left to do.
Here are some takeaways from this presentation. First, standardized web APIs are great. In our handlers, in our server software, we are using the same Request and Response objects from fetch; we are completely reusing them. Developers are already familiar with them, so they know what to do with them. This increases the interoperability of runtimes, which means you can migrate your API or your handlers from one service to another, from one standalone runtime to another. Thank you very much, WinterCG. As for Hono, it runs everywhere. It's very easy to migrate between runtimes or services without any code changes required. It's got an excellent developer experience: everything is type-safe, data validation is excellent, and it's very easy to get going or even migrate your existing projects. Because the API is so simple, it's easily extendable; if you don't see a middleware you like, you can easily develop it.
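As a small sketch of the validation flow described above, assuming the zod and @hono/zod-validator packages alongside hono: the choice of a POST endpoint and the JSON body as the validated part is mine, not from the slides.

```typescript
import { Hono } from "hono";
import { z } from "zod";
import { zValidator } from "@hono/zod-validator";

const app = new Hono();

// The schema: an object with a "name" field of string type.
const schema = z.object({ name: z.string() });

// zValidator("json", schema) checks the JSON body before the handler runs;
// a payload in the wrong format gets an error response immediately.
app.post("/hello", zValidator("json", schema), (c) => {
  const { name } = c.req.valid("json"); // already validated, typed as { name: string }
  return c.json({ message: `hello ${name}` });
});

export default app;
```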
Please go and try Hono. It's fantastic, it makes your life easier, and in case you don't like a service you've been running Hono on, you can easily migrate to another one. If you want to know more about Hono, here are some links. The first three are for the Hono project; the last one is for Yusuke, the creator of Hono. Thank you very much for listening to this presentation. In case you have any questions about the presentation, any feedback, or any questions about Hono, please reach out to me and I will share my experience with you.
...

Nikolay Pryanishnikov

Full Stack TypeScript Developer @ Station Labs



