Deepkit – High-Performance TypeScript Framework (deepkit.io)
251 points by nikolay on June 8, 2022 | 129 comments



I'm not a fan of the idea of leveraging TS types at runtime. This is just a lock-in to TS, even if type annotations become a thing in JS. I don't like ORMs that use runtime types either. Most of the time, I want to write raw SQL.

So as an experiment, I created a library that statically types raw SQL:

https://github.com/nikeee/sequelts

The idea is to parse the SQL queries using TS's type system. The parsed query is combined with the database schema and therefore, we know what type the query will return.

This is especially useful due to TS's structural type system. It's also zero-overhead due to the entire functionality being just TS types, which vanish as soon as we compile. It therefore also works JS-only.

However, it's just a proof of concept. I'm working on an ANTLR target for TS types, so that the SQL parser can be generated. A game changer will also be the integration with sql-template-tags [1] (which would make this actually usable).

This is just for selecting data. Time will tell if it's feasible to also type-check mutating queries.

The primary use-cases will target SQLite/D1.

[1]: https://www.npmjs.com/package/sql-template-tag
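The core trick can be sketched with template-literal types. A minimal, hypothetical example of the idea (not sequelts' actual API):

```typescript
// Hypothetical sketch: parse a "SELECT col FROM table" string at the
// type level via template-literal types and look it up in a schema.
type Schema = {
  users: { id: number; name: string };
};

// Extract the selected column and table from the query string.
type ResultOf<Q extends string> =
  Q extends `SELECT ${infer Col} FROM ${infer Table}`
    ? Table extends keyof Schema
      ? Col extends keyof Schema[Table]
        ? { [K in Col]: Schema[Table][K] }
        : never
      : never
    : never;

// The runtime part is a passthrough; all checking vanishes at compile time.
function typedQuery<Q extends string>(sql: Q): ResultOf<Q>[] {
  return [] as ResultOf<Q>[]; // a real driver would execute `sql` here
}

const rows = typedQuery('SELECT name FROM users');
// rows is typed as { name: string }[]
```

Since everything interesting happens in the type system, the compiled JavaScript contains no parsing logic at all, which is the "zero-overhead" property described above.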


Here is another TypeScript library for validating types of SQL queries: https://github.com/MedFlyt/mfsqlchecker

It uses the actual PostgreSQL parser against your project's database schema to give 100% accurate results. It guarantees that all of your queries are valid and that all input and output types are correct (and it will also auto-generate/quickfix the output types for you in VS Code).


This is very cool. I like that it doesn't use a build process or code generation. And especially that the project doesn't need to re-implement parsing and type inference.


Hah a few months back I released https://github.com/ivank/potygen

Similar idea - statically typing queries. Mine was mostly me playing around with recursive descent parsers, seeing if I could actually parse SQL with one, and it seems to work OK, at least for Postgres.

It does require a "build" pass to generate the types, but I've added some additional bells and whistles like auto formatters and on-hover info about columns / views / tables etc. Once you have the SQL AST, it's pretty easy to build a lot of cool stuff around it.

It's all pure TS and doesn't use parser generators like ANTLR. We've been using it in prod for a while now and it seems to be working alright - it's mostly types anyway, but it does have a runtime component, since SQL parameters in node-postgres are too basic for our use case.

It all started from the amazing https://github.com/dmaevsky/rd-parse which showed me that you could build crazy complex parsers with like 100-200 lines of code.
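The recursive-descent approach really can be tiny. A toy sketch of the technique (nothing to do with potygen's or rd-parse's real code):

```typescript
// Toy recursive-descent parser: turn `SELECT a, b FROM t` into a small AST.
interface SelectAst {
  columns: string[];
  table: string;
}

function parseSelect(sql: string): SelectAst {
  const tokens = sql.trim().split(/[\s,]+/);
  let pos = 0;
  const expect = (word: string) => {
    if ((tokens[pos++] ?? '').toUpperCase() !== word) {
      throw new Error(`expected ${word}`);
    }
  };
  expect('SELECT');
  const columns: string[] = [];
  while (pos < tokens.length && tokens[pos].toUpperCase() !== 'FROM') {
    columns.push(tokens[pos++]);
  }
  expect('FROM');
  return { columns, table: tokens[pos] };
}
```

Each grammar rule becomes one small function (here just `parseSelect`), which is why a few hundred lines can cover a surprisingly large grammar.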


I don't agree with this being a con:

> Requires knowledge about SQL & your database

For me that's a pro, because it's transferable knowledge.

And

> No type safety

Is a bummer. If you could check at compile time (or some other way) that your queries are valid, that would be cool. In Rust there's a crate that does exactly that, sqlx [0]. And despite SQL being verbose, I found it easy and enjoyable to work with - it's easy to know exactly what a query does, while with ORMs it's easy to end up with a query that's hard to understand, where the only way to be sure is to run it and print the generated SQL.

[0] https://crates.io/crates/sqlx


PgTyped is another high quality alternative: https://github.com/adelsz/pgtyped.


Using PGTyped daily. Works well. Wish nullable columns could be coalesced though.


Looks like a great start.

My own library Zapatos[1] is less ambitious (it doesn't try to parse raw SQL) but has similar goals. It also has no truck with runtime types.

[1] https://jawj.github.io/zapatos/


I've used Zapatos and can vouch for its effectiveness. It doesn't bind you into any specific way of setting up your database or defining models, but provides an extremely easy way to integrate with other TS libraries in order to get typed queries.

Thank you for your incredible work on this.


That's great to hear! You're welcome. :)


author of ts-sql[0] here, this looks great (and a way more practical approach!)

[0] https://github.com/codemix/ts-sql


This is very slick!

> Time will tell if it's feasible to also type-check mutating queries.

What do you mean by "mutating queries"? When the underlying schema changes? For instance when a new column is added to the `video` table, or if `video.id` changes to a varchar?

I haven't dived in deep enough yet, but is this using the DB engine's EXPLAIN? I guess my question is: how does it interrogate the underlying db/schema if the schema isn't supplied as code in the setup? Or at least, how would you envision all that happening?


By mutating queries I thought of UPDATE, INSERT and DELETE.

It should be possible to parse those and check if they obey the schema, so raw SQL that updates/inserts/deletes data could also be checked.
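As an illustration of what that could look like, here is a hypothetical type-level check for an INSERT statement, in the same template-literal style (illustrative only, not sequelts code):

```typescript
// Hypothetical sketch: validate an INSERT's column against a schema at
// the type level; invalid statements fail to compile.
type DbSchema = {
  users: { id: number; name: string };
};

type ValidInsert<Q extends string> =
  Q extends `INSERT INTO ${infer Table} (${infer Col}) VALUES ${string}`
    ? Table extends keyof DbSchema
      ? Col extends keyof DbSchema[Table] & string
        ? Q
        : never
      : never
    : never;

// Accepts only statements whose single column exists on the table.
function runInsert<Q extends string>(sql: Q & ValidInsert<Q>): string {
  return sql; // a real driver would execute the statement here
}

runInsert('INSERT INTO users (name) VALUES ($1)'); // compiles
// runInsert('INSERT INTO users (age) VALUES ($1)'); // type error: no such column
```

A real implementation would also have to handle multi-column lists and check the VALUES arity, which is where the feasibility question comes in.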


Ah! Always struggling with terminology specifics when talking about DBs/SQL.

Sometimes when people use the term SQL, they mean DDL, DQL (data query language) and DML (data manipulation language); other times they mean just queries (DQL).


Wow, I was literally thinking about this like a week ago, wondering how viable it would be. This is great work. The ability to compile ANTLR to TypeScript types would potentially be a game-changer.


I just want to say that this looks fantastic, especially the runtime types stuff - the idea that types are values hasn't caught on yet, but it's great to see movement in this area. A long time ago I did something very similar to this for Flow (https://gajus.github.io/flow-runtime/#/try), but TypeScript is a much more suitable target. Congratulations on an amazing job. I really hope you have plans to sustain this project for a long time, because it's very ambitious but very compelling.


Great to see a new option in the TypeScript ecosystem. A fully featured framework. Brilliant. Could be a great alternative to Adonis.js which is one of my favourite nowadays: https://adonisjs.com/


TypeScript's type system can be very powerful and flexible - creating types at build time in a manner of speaking. Here's an article I wrote on an aspect of this called mapped types. You can see how types can be composed of other types.

https://www.lloydatkinson.net/posts/2022/going-further-with-...


This project is attempting to boil the ocean, and I'm skeptical it can do each of these things well, considering that the dozen listed libraries don't have documentation. Including:

- D.I. https://deepkit.io/documentation/injector

- Topsort "A fast implementation of a topological sort/dependency resolver with type grouping algorithm" https://deepkit.io/documentation/topsort

- Templates https://deepkit.io/documentation/template

I'll stop there; I'm too lazy to do their work for them. Other red flags:

- why is a very specific sorting algorithm bundled in?

- It's claimed their RPC implementation is 8x faster than gRPC, specifically gRPC-js. Don't you think Google has the resources and the incentive to make gRPC faster? It's likely this is not an apples-to-apples benchmark.

- MongoDB ORM. It's a fine database, but it's also a proxy indicator of web developers who only breathe JavaScript.

- Message broker. Very opaque, but it's likely this is coupled to a very specific pubsub provider.

- API console for testing HTTP calls. This is a solved problem, with tools for testing APIs like Postman. The fact that the team is trying to solve all sorts of problems, each of which is a standalone product, makes me feel they are trigger-happy at reinventing the wheel.

- For the view layer, it seems hard-coded to Angular. That's probably a no-go unless you don't care about what component library framework you have to use.

This project seems to be a single author project. Kudos for being ambitious, but I would heed caution at trying to solve every problem all at once. Maybe lean into the ORM aspect and make that best in class first.


> MongoDB ORM. It's a fine database, but it's also a proxy indicator of web developers who only breathe JavaScript.

Yet the author appears to be writing the parser in C++: https://mobile.twitter.com/MarcJSchmidt/status/1534323199147...


> considering that the dozen listed libraries don't have documentation.

I don’t think this is a “Show HN” or “Launch HN” posting. It’s possible the author is still working on code and docs, and has no idea the site’s been posted here.


That's right. It's still being worked on, so a lot of the documentation doesn't exist yet. But work is also being done in parallel on a book/new docs that cover all areas.


To be honest, I'm baffled by the utter excellence of this.


It’s an in-progress project. Regarding topsort, my guess is the algorithm was useful for some part of the project, and the author decided to make it available generally.

> Don't you think Google has the resources and the incentive to make gRPC faster?

You’d think that, but A) Gmail, and B) having worked for a Google competitor, I can assure you a passionate developer can write more performant code than what a massive corporation will generally produce.

> I would heed caution at trying to solve every problem all at once.

Agreed, but I hope the author throws caution to the wind. I’m tired of having either A) 1 million npm dependencies or B) boiling the ocean in my own projects, so it’d be nice to have a batteries included project you can rely on.


Yes, it covers a lot. But in my eyes that is necessary to provide high performance throughout the whole application, and necessary to offer something fundamentally unique. It's a matter of providing very fast basic building blocks. For example, you cannot make an ORM faster without first having a very fast deserialiser, or a fast RPC implementation without very fast binary validation and encoding. So it all works together, yet is decoupled and split up into multiple packages so they can be used separately.

A lot of stuff is not yet documented, but it will be. We are already writing new documentation/a book, roughly 40% done, which covers a lot more than what is currently on the website - this HN post just came at somewhat bad timing.

> - its claimed their RPC implementation is 8x faster than gRPC, specifically gRPC-js. Don't you think Google has the resources and the incentive to make gRPC faster? It's likely this is not an apples-to-apples benchmark.

Yes, gRPC is fast. But not gRPC-js. I invite you to test it yourself; all our benchmarks are in the git repository, and how to run them is explained here: https://deepkit.io/documentation/benchmark. Feel free to join our Discord so I can guide you through the process. As for why gRPC-js is slower: it uses Protocol Buffers, which is a binary format, and parsing binary is notoriously slow in JavaScript. By utilising the runtime type information of TypeScript, we can generate JIT binary encoders/decoders for very specific types, which are much faster.
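The JIT idea can be illustrated independently of Deepkit: given per-type metadata, generate a specialized serializer function once with `new Function`, rather than interpreting the metadata on every call. A hedged sketch (not Deepkit's actual codegen, and JSON rather than binary for brevity):

```typescript
// Sketch of JIT serializer generation from type metadata.
type FieldMeta = { name: string; kind: 'string' | 'number' | 'date' };

function buildSerializer(fields: FieldMeta[]): (obj: any) => string {
  const parts = fields.map(f =>
    f.kind === 'date'
      ? `${JSON.stringify(f.name)}: obj.${f.name}.toISOString()`
      : `${JSON.stringify(f.name)}: obj.${f.name}`
  );
  // The generated body is straight-line code specific to this one type,
  // so the engine can optimise it like hand-written code.
  const body = `return JSON.stringify({ ${parts.join(', ')} });`;
  return new Function('obj', body) as (obj: any) => string;
}

const serializeUser = buildSerializer([
  { name: 'id', kind: 'number' },
  { name: 'username', kind: 'string' },
]);
serializeUser({ id: 1, username: 'marc' }); // → '{"id":1,"username":"marc"}'
```

The one-time codegen cost is amortised over every subsequent call, which is where the speedup over generic, metadata-walking serializers comes from.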

> - why is a very specific sorting algorithm bundled in?

Why is providing something as a dedicated package a red flag? It's fundamentally necessary for a unit-of-work ORM, for example, and useful on its own. My other topsort package, in PHP, has 4 million installations, so it's only logical to bring the same functionality to TypeScript.
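For readers wondering why an ORM needs this: a unit of work must persist entities in dependency order (e.g. users before the images that reference them). A minimal depth-first sketch of such a sort (not @deepkit/topsort's actual API):

```typescript
// Minimal topological sort: returns items so that each appears
// after everything it depends on; throws on cycles.
function topsort(deps: Record<string, string[]>): string[] {
  const result: string[] = [];
  const done = new Set<string>();
  const visit = (node: string, path: Set<string>) => {
    if (done.has(node)) return;
    if (path.has(node)) throw new Error(`dependency cycle at ${node}`);
    path.add(node);
    for (const dep of deps[node] ?? []) visit(dep, path); // dependencies first
    path.delete(node);
    done.add(node);
    result.push(node);
  };
  for (const node of Object.keys(deps)) visit(node, new Set());
  return result;
}

topsort({ image: ['user'], user: [], comment: ['user', 'image'] });
// → ['user', 'image', 'comment']
```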

> Message broker. Very opaque, but it's likely this is coupled to a very specific pubsub provider.

It uses its own server, but will probably be replaced soon. Again, somewhat bad timing of that post.

> - API console for testing HTTP calls. This is a solved problems with tools for testing APIs like Postman

It doesn't solve what Postman solves. It's a way to automatically document your HTTP and RPC API, plus it allows you to execute/try endpoints right in the browser. Swagger UI comes closer to what the API Console is.

> - For the view layer, it seems hard-coded to Angular

There is actually no view layer. It has a server-side DI-enabled template engine based on JSX and a desktop UI kit (which is based on Angular). Both have their own use-case and are not required.


This looks really interesting. The possibility to define runtime types basically removes the need to write data object validation blocks, greatly alleviates database sync etc. I'm sure I'm not the only one to have implemented runtime types on top of Typescript for this reason, albeit not at this low level and therefore incurring a certain performance penalty.

So performance would be one reason to choose Deepkit, for me. The integrated framework looks interesting too. Something that makes me hesitant is the relatively small user base and size of the community (mainly one single developer?), considering this would basically mean committing to a new programming language. And if the TypeScript team should decide to do anything similar, the project might be dead in the water?


I never understood the "single developer" argument. React itself does not even have more than 5 active developers (https://github.com/facebook/react/graphs/contributors), and Vue literally has a single one (https://github.com/vuejs/vue/graphs/contributors). You can apply this to almost any library. I mean, what's that argument for?


It means that the project depends on the whims or life events of a single person - a single point of failure if you like: critical patches may not be merged, the maintainer might turn malicious etc. More people on the project means more eyeballs on the code. This seems self-evident to me (especially if the project is core in ones tech stack) so I'm really surprised at the "what's the argument for?". Am I missing something?

(I don't see how you can cite Vue as an example; Evan is the main contributor to Vue, but he is certainly not the only one.)


This seems very interesting and I feel such a cohesive framework is needed in JS/TS, as an alternative to cobble together various unrelated packages.

On the other hand, I wonder what will be the level of support for each of these libraries - just the ORM or desktop UI seem like a lot of work.


One man's cobbling together is another's choosing the best library for the job ;)


How on earth did selecting coherent, focused, modularized libraries become taboo for so many programmers?

I will never understand. Nor sympathize.


A bit late, but I need to answer this: because several of those coherent, focused and modularised libraries become unmaintained after a couple of years. Then your team needs to make another selection of coherent, focused and modularised libraries.

I suppose some developers enjoy doing this again and again. Many don't.


I just want to put a word in for Marc Schmidt. He's a great engineer, with great product sense. This project will move fast, and if you think it's missing something essential, it's probably on the roadmap and will be realized quickly.


thank you so much!


It looks really cool, but I feel it's too much to take in one bite. Making the framework do (almost) everything makes me think it won't be able to cover all the nuances of single-purpose libraries. I'd be much more open to trying it if it was just a general wiring around Fastify & Prisma for a better Node.js backend rather than one tool to do everything.


They do have the option to just pick and choose from a variety of libraries: https://deepkit.io/library. I like the look of @deepkit/type.


That is a non-goal. One of the biggest advantages of Deepkit and its runtime type feature is that you can reuse TypeScript types throughout your whole application stack. This does not work with Fastify and Prisma. Let me give you an example.

Let's say you build a little user management app and need a `User` type. So you define one:

  interface User {
    id: number;
    username: string;
    created: Date;
    email: string;
  }

If you want to build a REST API that returns that user, you can just do that:

  router.get('/user/:id', (id: number & Positive): User => {
    //...
  });

Deepkit automatically serialises the object into JSON using the type information in `User`.

You want to accept a User object for creating a user in the database? No problem:

  router.post('/user', (user: User): User => {
    //...
  });

You probably want to enrich the interface with additional constraints:

  interface User {
    id: integer & Positive;
    username: string & MinLength<3> & MaxLength<32>;
    created: Date;
    email: Email;
  }

And with that, the framework automatically validates all incoming requests against the `User` object and the validation constraints it contains. It also deserialises it automatically from JSON, for example. If `User` is a class, it instantiates a class instance.

But you don't want to require all fields, like id and created should be set by the server:

  router.post('/user', (user: Omit<User, 'id' | 'created'>): User => {
    //...
  });

This works equally well. And the best part: you get API documentation for free with the Deepkit API Console, just like you know from Swagger UI.

Ok, but we need to save the User in our database, right? We do not need to duplicate the User in yet another language like with Prisma. Just re-use the User interface and annotate the fields.

  interface User {
    id: integer & PrimaryKey & AutoIncrement;
    username: string & MinLength<3> & MaxLength<32> & Unique;
    created: Date;
    email: Email & Unique;
  }

We still use the very same interface, but have added additional meta-data so you can use that exact same type now also for the ORM.

  router.get('/user/:id', async (id: number & Positive, db: Database): Promise<User> => {
    return await db.query<User>().filter({id}).findOne();
  });

Next, let's go to the frontend. We obviously have to request the data from our rest API. We can just do that and reuse the `User` interface once again.

  const userJson = await (await fetch('/user/2')).json();
  const user = cast<User>(userJson);

`cast` automatically validates and deserialises the received JSON. We import `User` wherever needed, and this works because the meta-data is not tightly coupled to any library, so you won't pull in, say, ORM code just because you use database meta-data. You can literally use TypeScript's powerful type system to your full advantage in many small to super complex use-cases and let the framework handle all the validation, serialisation, and database work.

And like that you are able to reuse types everywhere: from database to HTTP router, configuration, RPC, API documentation, dependency injection, message broker, frontend, and more. That's not possible in this form in any other framework, or when you combine lots of third-party libraries and glue them together manually.

Deepkit separates the functionality into libraries, which would allow you to use their features in Fastify and Prisma too, but that would mean you lose one of the biggest advantages of all: reusing types.


IMO reusing types is an anti-pattern. It leads to knots of functions which are all tightly coupled since they “use the same types” even though they actually only need small subsets of those types. Changing one of these functions will then often break several of the conjoined functions.

When you start to hear people say of their PRs “it works but I just have to fix the types”, that usually means the codebase is trying to re-use too many types. In my experience.

The beauty of TypeScript is that you can have two different very narrowly specified types and the compiler will tell you if they’re compatible or not.

Reusing types is throwing away the most valuable feature of TypeScript.

For me, the type signature is part of the function signature and therefore should not be re-used.

Imagine what a disaster it would be if function signatures were reusable:

    sig UserDatabase(user, db)

    create: UserDatabase =>
      db.write("users", user)

    update: UserDatabase =>
      db.get("users", user.id).update(user)

It’s too much. DRY can be taken too far.
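The point about narrow types and structural compatibility can be made concrete (hypothetical names):

```typescript
// Two independently defined, narrow types; no shared named type, yet
// the compiler checks compatibility structurally.
interface HasEmail {
  email: string;
}
interface NewUser {
  email: string;
  name: string;
}

function sendWelcome(to: HasEmail): string {
  return `mail to ${to.email}`;
}

const u: NewUser = { email: 'ann@example.com', name: 'Ann' };
sendWelcome(u); // OK: NewUser is structurally assignable to HasEmail
```

Each function declares only the fields it actually needs, and the compiler bridges the gap, so changing `NewUser` doesn't ripple into `sendWelcome`.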


It's not an anti-pattern to reuse types; it's an anti-pattern to apply something you don't need, or to apply too much of it, which is exactly what you are talking about. There is a "too much" of everything. It's up to you how, and in which parts of your app, you use it. The difference is: you now have the option to do so where it makes sense. Nobody says you have to reuse everything up to the point where it becomes counterproductive. Having the option vs not having the option. In my opinion that's definitely better than being forced to learn new, inflexible languages/APIs to describe your model, like JSON Schema, decorators, OpenAPI JSON, GraphQL, Prisma, ... - whether you reuse is up to you.


yes yes yes! this is so true


Ok you got me sold on the idea - being able to configure just one schema for everything with validation and deserialization does sound amazing. And really appreciate the detailed reply!

While it does sound quite great, I have to try it in practice to understand whether it would work for me. To prevent framework fatigue I really try not to switch frameworks too often, which is why I'd like to be able to reuse what I know already. And your scope sounds really big. Yet I have always liked how the TS/JS community keeps pushing the envelope!


> Next, let's go to the frontend … const user = cast<User>(userJson);

Wait, how can Deepkit's `cast` work on the frontend? Do you compile to WASM?

Quickly skimming the current Deepkit homepage and intro blog post didn't seem to hint at this.


It makes the types available at runtime. You can take a look at the current WIP version of the new upcoming Deepkit book, which explains it in great detail: https://deepkit-book.herokuapp.com - see section "2. Runtime Types".


Thanks – the question I had was answered in the "Bytecode" part of that section: https://deepkit-book.herokuapp.com/deepkit-book-english.html...

My summary is that the DeepKit compiler is a JS transpiler (used as a ts-node or webpack plugin, or conveniently/invasively as a replacement of node_modules/typescript) that produces code like this:

    //TypeScript
    export type TypeA = string[];

    //generated JavaScript
    export const __ΩtypeA = ['&F'];

    //TypeScript
    function log(message: string): void {}

    //generated JavaScript
    function log(message) {}
    log.__type = ['message', 'log', 'P&2!$/"'];

Where '&F' and 'P&2!$/"' are the compact bytecode translations of the types ('&' means string, 'F' means array), and '__Ω' is a prefix used to avoid naming conflicts.
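The mechanism can be illustrated with a toy stack machine; the ops and encoding below are illustrative only, not Deepkit's real bytecode:

```typescript
// Toy bytecode interpreter: each op pushes or wraps a type on a stack,
// producing a type object at runtime.
type RuntimeType =
  | { kind: 'string' }
  | { kind: 'array'; element: RuntimeType };

function runBytecode(ops: string): RuntimeType {
  const stack: RuntimeType[] = [];
  for (const op of ops) {
    if (op === '&') {
      stack.push({ kind: 'string' }); // '&' pushes the string type
    } else if (op === 'F') {
      stack.push({ kind: 'array', element: stack.pop()! }); // 'F' wraps the top in an array
    }
  }
  return stack.pop()!;
}

runBytecode('&F'); // → { kind: 'array', element: { kind: 'string' } }
```

Executing lazily, on demand, is what keeps the runtime cost near zero for types that are never reflected upon.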

Personally, I'd love to see a live example of code in & compiled code out on the homepage. I'm sure it's on your roadmap :)


This looks to me a lot like the FastAPI framework in Python. Thanks for the demo.

> This does not work with Fastify and Prisma

The point of Prisma is that it generates types for you from the Prisma schema, right? Are the generated types not interchangeable throughout the application?


They surely can be used in the app, but a) it requires that code generation, and b) you are limited by the expressiveness of Prisma's DSL. The DSL supports much less than TypeScript: the last time I looked, no generics, conditional types, mapped types, (const) enums, index signatures, class instances (+methods), and more. Basically you lose the full power of TypeScript that way. Also, you would have no way of using the interface in validation and serialisation, which is more important than just using it in type checking - you would need to rewrite the entity in another format like JSON Schema or zod-like libraries, which means in the end you still duplicate it.


I was pretty sure that Prisma also supported generating OpenAPI schemas and GraphQL, but maybe I'm wrong.

Fair point about expressiveness, but if your data must cross application boundaries then you are somewhat limited in expressiveness no matter what, because your types need to be understood by arbitrary clients.

Lack of generics is pretty annoying though. It's probably my biggest annoyance with GraphQL.


Good point. That's why I'm working on a C++ version of TypeScript's type checking/computation so it can be used in all languages. That would allow us to use TypeScript types literally everywhere and remove that barrier.

Prisma's DSL has no syntax for validation constraints, so even with OpenAPI generators it's incomplete.


Very interested in this. Is there a good place to stay updated on your progress?


I’m also interested in any updates regarding this entire stack.


I'm really bad at marketing, but I regularly tweet [0] and chat with the Discord [1] community. I'm trying to keep followers up to date with the C++ TypeScript stuff. Feel free to join.

[0] https://twitter.com/MarcJSchmidt [1] https://discord.gg/U24mryk7Wq


Will do.


Join the Discord!


one interesting thing is that it looks like the built-in debugger can also do static analysis and draw a call graph of the execution context.

[0]. https://deepkit.io/assets/screenshots/debugger-http-light.pn...


The special compiler for runtime types is available separately at https://deepkit.io/documentation/type, this is great! I will definitely be testing this in our RN app.


I remember seeing this before and as far as I can tell, nothing important has changed since then.

It's still a project where the bus factor is extremely low: the author has 95% of all the commits. The project is basically reinventing every wheel in the typescript space, too, and it's going weird ways with a custom compiler and crazy reflection.

It's dogfooding hard, so it's all built on its own things, and all of those things are definitely going to have bugs because one guy wrote all of it, and it's just him to fix them.

I think it has a lot of neat stuff and cool things I'd otherwise like to have, but there is basically no way I can adopt this framework for anything serious because of the reasons above.


The benchmark page doesn't load anything for me https://deepkit.io/benchmarks.


yeah, it looks like data points are missing.


It's fixed.


I'm unable to run the benchmarks at all. If I try to run them on my local machine, something is wrong with the TypeScript setup and they won't even start.

If I try the provided Dockerfile, Alpine can't install python for whatever reason.


Cool jooq[1]-query-syntax-like ORM! https://deepkit.io/library/orm

It does not seem to have the code-generation (for the db table DTOs) though.

I also like typescript's "string & MinLength<3> & Email" type combining.

A quick edit-test loop must be a great selling point of this kind of FW compared to FWs in more strongly typed languages (which have slower edit-test loops due to compilation) that I usually prefer.

[1]: https://www.jooq.org/


The typing is interesting as I have been leveraging Typescript as an IDL for code generation between Go backend and TypeScript frontend. (GraphQL and gRPC were not expressive enough to define services, DTOs and JSON schema forms). Parsing Typescript's AST is simple. My guess is they generate code based on statically analyzable type arguments. I'm eager to try their approach. I am currently parsing jsdoc comments and tags for metadata which doesn't benefit from Typescript's checker.

jooq is awesome. I was using that with DropWizard way back when. Now I'm just Go.


We generate bytecode from the TypeScript AST, embed it into the emitted JavaScript, and execute it on demand in a virtual machine at runtime. The result is a type object containing the computed type of a TypeScript type expression, which can then be used for various use-cases. We use them for the ORM, validation, serialisation, RPC, the HTTP router, etc. You can learn more about how it works here: https://deepkit.io/blog/introducing-deepkit-framework and in great detail in the TypeScript Bytecode proposal: https://github.com/microsoft/TypeScript/issues/47658. Soon a new doc/ebook will explain in great detail how it works and how its API can be utilised to build, for example, your own libraries/functionality on top of it.


It’s a great read. Thanks.


Rather than do the typical HN thing of commenting on something irrelevant (like the landing page), I'll comment on the actual thing being presented:

First, I really like that this leverages Typescript to its fullest. I'm a big fan of doing things like runtime reflection of compile time types.

However:

    id: number & PrimaryKey & AutoIncrement = 0;

One thing that seems confusing is that (presumably) these compile-time types are being translated into runtime type behavior (I'm guessing by your TypeScript transformer?). While this is very cool and concise, I suspect it could be super "magical".

In your case, has Deepkit felt too magical?


Having TypeScript types at runtime definitely feels like magic at the beginning. A TypeScript transformer makes them available at runtime so that libraries like the ORM, validator, or serialiser can use them to do whatever they do. You can read more about that in the introduction post: https://deepkit.io/blog/introducing-deepkit-framework


I skimmed some of Deepkit and looked at your responses here. I have to say these parts of Deepkit really seem to solve a major pain point that I have been investigating fairly recently. Kudos.

The places where I consider using TypeScript are UI/CRUD/frontend centric, because I feel like the ecosystem around JS/TS is really rich and (finally) maturing when it comes to these things. So it's really all about coordinating UI/user feedback, validation, serialization/parsing etc., and all that under regular change/evolution.

Now my conclusion so far has been to just try and keep it simple and separated and do the plumbing between all those parts explicitly. There is a benefit to that as well, but it does require more boilerplate and more bug surface. I will definitely tinker with some of the Deepkit libs to reassess this, because I think it might just do the right things in those areas.


This is really wild. It feels almost like a way to wedge dependent types into the system, at least that's what the MinLength<2> & MaxLength<5> stuff feels like. Amazing.

How was it, digging into the compiler API for that stuff? Every time I've poked around with it, it's felt like a mess. Any resources you'd recommend?


This is excellent work.


Thank you!


This is/was a common concern about frameworks in Python like Attrs and Pydantic. And for the most part, no: it's just the right amount of magic. This is maybe a bit more magical than those, but as long as there is a coherent underlying data model and sufficient debugging capability, this kind of thing is usually a productivity boost with little downside.


Aren't the latter two definitions for the ORM side of things? Just guessing, I haven't looked into it too much.


> Rather than do the typical HN thing of commenting on something irrelevant (like the landing page)

I think you just did! (comment on something irrelevant to Deepkit, like the generic and uninformative landing page, that is)


CLASSIC! Replying to an unrelated comment by pointing out how unrelated that comment is.


Reminds me of Python's FastAPI - https://deepkit.io/framework


One thing I think TypeScript is missing (on the server at least) is a Laravel-like framework - in that it covers everything, not its exact style. I'll be interested to keep an eye on this.

I wish it had serverless support, but I know that's a whole other beast. I have really enjoyed the majority of my TypeScript-on-serverless work and would love a formal framework instead of what I've cobbled together.


NestJS[0] is the most popular framework to solve pretty much this exact issue.

It does support serverless[1], as well as many other features you can expect from such a framework.

[0] https://nestjs.com/

[1] https://docs.nestjs.com/faq/serverless#serverless


To be honest, I don't think NestJS is even remotely comparable to anything like Symfony/Laravel.


Serverless is being worked on. We indeed plan to be Laravel/Symfony/Spring boot in the TypeScript world. Time will tell if we will be successful.


It already looks like SpringBoot at first glance. You'll have troves of Java devs flocking to something like this.

There's a general dislike of Java/C# nowadays, but you can't deny that Spring/Spring Boot brought stability to the language and its mainstream usage.

Something like this in TS makes it feel familiar, and after trying the examples I am quite excited!


You’re looking for Redwood


This looks very ambitious and at first glance looks amazing. I’m quite cautious of using node frameworks but I’m definitely trying this one out.

I like elixir/phoenix but I miss types and loving that this project is embracing types at its very core.


This was on HN a few weeks ago and may be of interest to you:

https://github.com/floodfx/liveviewjs


Thanks for letting me know! It does look interesting, but the "killer features" of Phoenix for me are easy tests with the SQL sandbox plus ExMachina, and mix code gen. I'm not too into LiveView, though I should work with it more.


Ah fair enough! I've never used elixir/phoenix, and I was under the impression that it _was_ LiveView. Now I get that Phoenix LiveView is just one component.


The logo bears a striking similarity to Lightspeed's...like, flip it along the vertical axis and the resemblance is uncanny.

https://www.lightspeedhq.com/


ORM, DI, reflection, config, validation, events...

So... a hipster SpringBoot then? The history doesn't repeat but it rhymes indeed :)


If people say this is a modern version of Spring Boot, then I'm all for it.


> A runtime type system for JavaScript, powered by TypeScript types.

why would I choose this over Deno?


Curious, why would this be related to Deno?


Noob question here, but why do this:

`string & MinLength<3> & Unique;`

When one could use a Value Object Type?
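A hypothetical comparison of the two approaches (not Deepkit code): a value object carries validation in its constructor but wraps the value in a new nominal type, while the intersection style keeps the value a plain string.

```typescript
// Value-object style: validation lives in the constructor, but callers
// must unwrap .value and serialisation needs special handling.
class UsernameVO {
  constructor(readonly value: string) {
    if (value.length < 3) throw new Error("username too short");
  }
}

// Intersection/brand style: the value stays a plain string, so it can
// be passed anywhere a string is expected and serialised as-is.
type Username = string & { __brand: "Username" };

function parseUsername(value: string): Username {
  if (value.length < 3) throw new Error("username too short");
  return value as Username;
}

const a = new UsernameVO("alice").value; // unwrap to get the string
const b = parseUsername("alice");        // already a string
console.log(a === b); // prints true
```

That structural transparency is presumably why Deepkit leans on intersections: the annotated type is still just a `string` at runtime.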


I have one silly question. Is this for backend or frontend?


Both, but for the backend it's an entire framework that replaces something like Express/NestJS, while for the frontend it is not. In the frontend you can use the validator and serialiser, so that types can be easily shared between frontend and backend. There is also a frontend desktop UI kit, but that is not the main focus (it was mainly separated into its own package because it was developed for GUI apps like the API Console and Framework Debugger in Deepkit). In the vast majority of cases you should use the regular React/Angular/Vue frontend stack.


Is there a link you can share to your backend framework plans?


There’s a library called @deepkit/desktop-ui and another @deepkit/mongo. I guess you can use it anywhere; just pick the libraries that make sense for your environment.


This looks interesting, but a bit of a headache to scale.


Can you explain why this would be a headache to scale? I ask because it is designed to scale well.


I closed the page after waiting about 5 seconds for the page to load.

Yes, I realize that the "loading" was really just Javascript effects to present each element individually. I'm sure that artists and designers love that stuff. But the people interested in a programming language framework are people who have things to do and value their time - otherwise they would not be looking for a framework.


Not to mention that the landing page says absolutely nothing about what it is. Just saying that it's a framework means nothing. A framework can be a lot of different things.

Sorry to sound so negative.


I think if they lifted the diagram from the docs [0] to the front page it would make things a lot more clear.

0: https://deepkit.io/framework


Yeah, it appears that they made the framework page their home page now, which is much better.


I agree with you, the landing page is very poor. If they are trying to sell the framework to someone, it needs clearer information. Maybe they can learn from the next.js, remix.run, and redwoodjs landing pages.


I agree. We will do that once we shift focus to marketing/website, which is when we are out of alpha. Currently it's a pragmatic, low time-budget site whose only purpose is to tell people that this exists, not to sell it like the companies you linked.


I agree, so I removed it. Designers sometimes recommend something that will not work out in the real world, so I have no hard feelings about removing that.

The website is still not optimised, though. It's a big SPA that loads most things in advance and followed a pragmatic mindset: better done than perfect. Making it faster is low priority; it shouldn't be a problem, however, as the target audience usually has plenty of bandwidth.


> Designers sometimes recommend something that will not work out in the real world

There's also common sense. There are 33 words on that page, and it took 5 seconds for them to appear. No amount of design can justify that.

Additionally, the work will be served better if https://deepkit.io/framework is the home page, not the nothingness that is the home page


Yeah, I thought about that, too. Will probably do that, thanks! //edit: I just did that


Awesome!


They've removed the animations now. The page still loads ~2MB of Javascript, but at least it doesn't faaaaaaaaaaaaaaadeeeeee iiiiiin ooooooone woooooord aaaaat aaaaa tiiiiiime.


I agree. You can jump straight to the framework docs [0].

[0]: https://deepkit.io/framework


- High performance

- Scripting language

Pick one.


For me, "high-performance TypeScript framework" implies that it's high-performance compared to other TypeScript frameworks rather than high-performance period. In the benchmark pages (https://deepkit.io/benchmarks), the components of the framework are compared to other TypeScript libraries, which seems to support my theory. But your interpretation is valid too, I think. English is a second language for me so I'm not sure if this is something up for debate or not.


High performance as in "minimal performance cost on your actual JS environment"?


Yes, that is what is meant, also in terms of development speed. Of course, there will always be faster implementations in other languages, but in the context of JavaScript/TypeScript, it's high performance. The introduction blog post explains more about why this is: https://deepkit.io/blog/introducing-deepkit-framework


Not entirely true.

https://www.techempower.com/benchmarks

It has entries for JS and PHP quite high in the ranks.


There's a super interesting article from the just-js author explaining how he got a JS implementation into that top 20: https://just.billywhizz.io/blog/on-javascript-performance-01...


The only JS entry I see in the top 20 uses a custom JS runtime that has "no support for ES modules". Not exactly a production-ready application.


That's moving the goal post though. The argument was "High performance != Any Scripting language", which is clearly not true.


PHP achieves a decent 40% in the composite score table though:

https://www.techempower.com/benchmarks/#section=data-r20&hw=...

I suppose you'd need to have Facebook-level scale for infrastructure to cost more than development in a "faster" language.


Still high performance scripting though?


This is not always true. I recommend looking at Bun.js (https://bun.sh) and following its creator Jarred Sumner on Twitter (@jarredsumner)


This is just a bundler, no?


It's also written in Zig, not a scripting language, so that can't be what they mean either.


haha Solid Engineered...


Rather than just deriding someone, you could of course offer them a way to improve, by suggesting that it should be 'and solidly engineered', or 'engineered solidly'.

The whole sentence / tag line could probably do with improving, but maybe this is not a native speaker, making derision all the more unacceptable.


"Made in Germany". Also, the grammar has been corrected by now.


My goodness, I advise you to read the implementation details of this and be humbled about your own aptitude for programming.


Can we just let JS/TS die and compile something better to WASM already?


I don't think JS/TS would die if you tried to kill it.


> Precisely aligned, gently optimised, solidly engineered.

If you write junk like this to describe your software project I am immediately going to assume you are completely hopeless and not to be trusted at all.

Not one of these 6 words convey any meaning at all. It's like the bad old days of calling everything web scale.



