Jsonnet – The Data Templating Language (jsonnet.org)
118 points by tomas789 on March 27, 2023 | 93 comments



We've been using Jsonnet to generate Pods for our complex DAGs at Opsgenie, wrote a similar one using Tekton: https://mustafaakin.dev/posts/2020-04-26-using-jsonnet-to-ge...

But if I was doing it again now, I would just use https://cuelang.org/ or https://dagger.io instead; Jsonnet is really hard to debug.


> Jsonnet is really hard to debug.

Yes! As always, it's mostly a tooling issue. Sure, some aspects of the language, like lazy evaluation, make it even harder, but fundamentally there is no reason we cannot significantly improve the overall experience.

And this is not just about debugging when things go wrong. It's also hard to navigate the codebase and understand where the templates you care about live. Things can happen at multiple levels (that's the feature!), so it can be quite hard to figure out which files you need to touch if you want to change something in the output. I'm working on a tool to answer the question: "if I wanted to change this field in the generated output, which places in the input files contribute to producing this value?"

    $ ursonnet testdata/child.jsonnet '$.deployment.spec.template.spec.containers[0].resources.limits.cpu'   
    testdata/common.libsonnet:27 
    testdata/base.jsonnet:5 
    testdata/common.libsonnet:23 
    testdata/common.libsonnet:22 
    testdata/config.libsonnet:5
The tool lives in https://github.com/mkmik/ursonnet . I got the basics working, but it doesn't work on my larger codebases due to a bug I haven't yet had time to hunt down. Interest/feedback/help from the community would help make this a reality.


Agreed.

I dislike that it's JSON to transform JSON; it becomes a rat's nest. I had similar experiences with XSLT back in the day.


It's not JSON though; it's more of a stripped-down purely functional language - functions are quite central.


In that case, why not just use OCaml (or F#) to create objects in a real language and then serialize them to JSON?


It's a mind-boggling amount of work to write correct algebraic types that make sense for most JSON and YAML config files. Pretty much every property is optional, but not really, since some combinations are allowed while others are not. If you are lucky there might be a JSON Schema with `oneof`s you can automatically map to unions; if not, there will just be documentation (e.g. Concourse YAML). Sometimes there will even be two algebraic datatypes encoded in one object, so you would need a 2 -> 1 value mapping (or 4 union tags for the possible combinations, whatever suits your fancy).

Since Jsonnet is side-effect free, types are somewhat less needed. You can instead just execute your code, which will generate values and can report arbitrary runtime errors (all the way up to the language server, even).

(Types would still be awesome for writing functions, though.)
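For what it's worth, here's a minimal sketch of what that looks like in practice; the helper name and fields are made up. Even without types, a function can validate its arguments with assert, so mistakes surface as soon as the config is evaluated:

    // Hypothetical helper: runtime validation in place of static types.
    local container(name, cpu) =
      assert std.isString(name) : 'name must be a string';
      assert std.isNumber(cpu) && cpu > 0 : 'cpu must be a positive number';
      {
        name: name,
        resources: { limits: { cpu: cpu } },
      };

    {
      containers: [
        container('web', 0.5),
        // container('web', '0.5') would fail with 'cpu must be a positive number'
      ],
    }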


I can't seem to wrap my head around CUE. Most of my Jsonnet is creating a nice abstraction layer using functions. It seems CUE specifically does not like that approach. Maybe I'm just looking at it the wrong way.


Yeah, code functions aren’t the right mindset for CUE. The essence of it is narrowing constraints and defaults. You have a bunch of constraints that make a template, like Server. Then something extends that, like StageServer. Then it has to get so specific (MyAppStageServer) that all the values are concrete or can be filled with defaults. The constraint narrowing for a field is something like int, 1-8, 1-4, 2. This makes multiple template use (like multiple inheritance) not a nightmare.


Similar experience. I ended up switching to hocon with some extensions.


Configuration as code, but in a quirky language - how about no? I'm sure someone uses this, someone created this, and so it might seem like it's useful, but we have enough clownishness out there with YAML, JSON combined with JS, JSON with eval, and so on.

Create a framework in a normal language, but one that is very limited by design in what it is able to access - Skylark/Starlark ("limited Python") is a good example[0] - and use that. The rest works the same way: take an input in this language/framework, generate outputs.

[0] - https://bazel.build/rules/language


Tried it[0], worked reasonably well. Be prepared for strong opposition from traditional “devops” folks (really just rebranded sysadmins) and boneheaded management “who don’t mind yaml” and will drag everyone down.

[0] - https://github.com/cruise-automation/isopod


My employer (who I don't speak for) used this for a long time. I'm not normally particularly opinionated about tech choices, but I can very strongly say: do not use Jsonnet unless you enjoy adding an esoteric unsupported language to your job configs for no reason.


> do not do X for no reason.

You are not contributing much here. Could you elaborate on what problems they were trying to solve and what other problems they ran into?


Anecdotes are absolutely a contribution.

Regardless, the reason is exactly what I mentioned, it’s esoteric, specific, and not backed by a community of knowledge.


What would you use instead?


Not OP, but I can highly recommend just using the programming language you already use to build a key-value map and then converting that to JSON. That way you only have one syntax to worry about. In my case that's Python (using dictionaries and json.dumps).

You could also try a templating framework (like jinja2), but then you have three syntaxes colliding: the programming language you call the templating engine with (e.g. Python for jinja2), the templating language itself (e.g. jinja2), and then JSON as well.


This works for some languages but not others (some typed languages are particularly ill-suited to the problem of writing CLI tools that dump heterogeneous dictionaries and/or objects with many optional fields).


Assuming you need something beyond basic JSON, a well-understood scripting language like JS or Python. Bazel if you want something more build-focused.


I can definitely sympathize here - in every context, straight JSON/YAML configuration never seems expressive enough, but the tooling created in response always seems to come with sharp edges.

Here are some of the things I appreciate about Jsonnet:

- It evals to JSON, so even though the semantics of the language are confusing, it is reasonably easy to eval and iterate on some Jsonnet until it emits what one is expecting - and after that, it's easy to create some validation tests so that regressions don't occur.

- It takes advantage of the fact that JSON is a lowest-common-denominator for many data serialization formats. YAML is technically a superset of JSON, so valid JSON is also valid YAML. Proto3 messages have a canonical JSON representation, so JSON can also adhere to protobuf schemas. This covers most "serialized data structure" use-cases I typically encounter (TOML and HCL are outliers, but many tools that accept those also accept equivalent JSON). This means that with a little bit of build-tool duct-taping, Jsonnet can be used to generate configurations for a wide variety of tooling.

- Jsonnet is itself a superset of JSON - so those more willing to write verbose JSON than learn Jsonnet can still write JSON that someone else can import/use elsewhere. Using Jsonnet does not preclude falling back to JSON (see the small sketch at the end of this comment).

- The tooling works well - installing the Jsonnet VSCode plugin brings in a code formatter that does an excellent job, and rules_jsonnet[0] provides good bazel integration, if that's your thing.

I'm excited about Jsonnet because now, as long as other tool authors decide to consume JSON, I can more easily abstract away their verbosity without writing a purpose-built tool (looking at you, Kubernetes) or resorting to text templating (ahem, Helm). Jsonnet might just be my "one JSON-generation language to rule them all"!
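To make the "superset of JSON" point above concrete, here's a small sketch with hypothetical file names: a colleague writes plain JSON, and a Jsonnet file imports it verbatim and layers abstraction on top:

    // plain-config.json (hypothetical), written by someone who only knows JSON:
    //   { "replicas": 2, "image": "nginx:1.25" }

    // stack.jsonnet imports that JSON as-is (JSON is valid Jsonnet) and builds on it:
    local plain = import 'plain-config.json';

    {
      deployment: {
        replicas: plain.replicas,
        image: plain.image,
        tier: if plain.replicas > 1 then 'ha' else 'single',
      },
    }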

---

Though if Starlark is your thing, do check out skycfg[1].

[0] - https://github.com/bazelbuild/rules_jsonnet

[1] - https://github.com/stripe/skycfg


Why do you think Jsonnet is quirky? It has pretty normal components overall: local bindings with lexical scope, imports, functions. Certainly more pedestrian / closer to most general-purpose languages. The only strange bit may be thunks / lazy evaluation, but it's definitely possible to opt out of relying on their behavior.


> Create a framework in a normal language

I hate working with SBT. It makes Maven's pom.xml look less cruel in comparison.


It always bothered me that the "tutorial" (https://jsonnet.org/learning/tutorial.html) uses mixing cocktails as its examples, rather than actual networks/infrastructure, which is a much more likely use case for Jsonnet (or maybe I'm just much less into mixing drinks than my fellow developers/operators).


If you didn't want a cocktail before working with jsonnet, you will want one after.


I'll take a Moscow Mule please.


The excellent Jsonnet training course [1] by u/Duologic somewhat fills this niche by having you play around with k8s resources and using external packages.

[1]: https://jsonnet-libs.github.io/jsonnet-training-course/


See also https://dhall-lang.org/

> Dhall is a programmable configuration language that you can think of as: JSON + functions + types + imports


There's also CUE: https://cuelang.org/


Cue is a little different I think. It doesn't attempt to make writing configs less tedious through the use of functions.

It is more about schemas and reliably merging documents, as far as I can tell. But weirdly, given its focus on schemas, I couldn't find any way for a document to link to a schema in-band!


> But weirdly given its focus on schemas I couldn't find any way for a document to link to a schema in-band!

In cue there's no real distinction between "schemas" and "documents". If you say:

   value: string
   value: "abc"
then cue "unifies" the definitions for `value`, sees that "abc" is a string, and therefore `value` is valid.


Yes I know, but there's no in-band way to say "this file conforms to the schema in this other file" like you can with XML or JSON schemas.

For Cue you would probably just want to #include a file... but as far as I know you can't do that.

That makes it far less useful for IDEs, linters and so on.

I don't think there's any reason they couldn't add that feature. Just a bit odd that they haven't already.


It makes writing configs less tedious using templates.

Not really sure what you mean by "documents" though.


Some config formats, like YAML or JSON Lines, support multiple "documents" per file.


I mean multiple files.



relevant: https://github.com/tweag/nickel#comparison-with-other-config...

FWIW, "YAML Typing = None" is for sure wrong; that's what the `!!map` annotations (https://yaml.org/spec/1.1/#tag/syntax) are for, and they're a very common RCE vector for dynamic languages, since they can cause execution when the parser instantiates the types.


We use Jsonnet with Tanka (by Grafana) for managing our Kubernetes manifests. It does take a bit to pick it up properly. The language itself is straightforward (coming from Nix it was easy); learning how to use ksonnet is the challenging part.

I'm still tinkering with how to organize things. I don't think I've fully grasped the purpose of mixins yet.
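For what it's worth, a "mixin" in the Jsonnet/Tanka sense is usually just an object patch applied with +, or a function that returns one. A minimal sketch (the field names are made up):

    // A mixin is just an object patch you merge in with +.
    local withReplicas(n) = { spec+: { replicas: n } };
    local withLabel(key, value) = { metadata+: { labels+: { [key]: value } } };

    local base = {
      metadata: { name: 'my-app', labels: {} },
      spec: { replicas: 1 },
    };

    // Compose behaviour by stacking patches on the base object.
    base + withReplicas(3) + withLabel('team', 'payments')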


The Prometheus Operator and Thanos projects only officially provide jsonnet libraries and example implementations, rather than Helm charts (which are provided by the community, but I detest Helm charts anyway), so I've picked up Jsonnet in the last few weeks to build our new monitoring stack, and I'm really impressed.

It's a massive learning curve but it has become really powerful and flexible.

I started trying to use Tanka but found the mental model a bit strange and struggled passing parameters into environments, so I tried Kapitan instead and that has been much more productive.


Original creator of both of those here. I recommend just generating things straight out with the `jsonnet -m` command, and then doing `kubectl diff` on a PR and applying on merge. We scaled a huge amount of infra this way and it's easily understandable and extensible.

To clean up the diff a bit I recommend using: https://github.com/sh0rez/kubectl-neat-diff

Hope it’s helpful!


I have to admit I'm a little starstruck to see your username replying to me. Thanks for all the effort and expertise you bring to bear on observability!

We're using exactly that diff-on-PR/apply-on-merge approach, checking the generated manifests in for easier reviews, but I found just using jsonnet -m made it difficult to DRY things out (e.g. set some URL values differently per environment) so I'd be curious to hear how you approach that?

Would you wrap the customised kube-thanos/prometheus-operator "calls" in another library?


There's nothing I dread more than needing to edit the tangled web of jsonnet files that constitute our Grafana dashboards. A jumble of spaghetti with hard-to-find documentation.


Aren’t you supposed to edit grafana dashboards via the UI?


I'm not the person you're replying to, but in our company we edit them in the UI but check the results into source control (so that we can re-deploy the dashboards consistently). I imagine for more complicated dashboards you could get stuck editing them manually, and it's probably the kind of thing where once you do it one time, you're stuck doing it for the rest of time.


There are a few very specific situations in which I might consider generating a dashboard without the UI, but even then I wouldn't use Jsonnet.


The Nix language, albeit for a package manager, is really similar. You get JSON-like "attrsets" and can have variables and functions. Then of course you can export to different formats as well.

https://nixos.org

There are also the Dhall and Nickel languages, which provide similar features.


My discussions with David Cunningham resulted in David producing Jsonnet and me creating what is now known as overlays in Nix.


We use Jsonnet extensively in the Aperture project for generating control policies.

- The policy spec is expressed in protobuf format.

- The gRPC gateway plugin is used to generate a Swagger (OpenAPI v2) spec from the proto files.

- Jsonnet bindings are generated from the Swagger spec.

- Blueprints are implemented using the Jsonnet bindings. Users generate policies from blueprints by providing configuration in YAML (using the aperturectl CLI) or via a Jsonnet mixin.

Relevant links:

- Aperture Blueprints (Jsonnet): https://github.com/fluxninja/aperture/tree/main/blueprints

- Generated doc example: https://docs.fluxninja.com/development/reference/policies/bu...

- CLI for generating policies from blueprints: https://docs.fluxninja.com/development/reference/aperturectl...

- Policy spec: https://docs.fluxninja.com/development/reference/policies/sp...


I'm interested in the code that processes the swagger/openapi into jsonnet, can you link to that?


It’s customized to our policy spec. But you can learn from this and adapt it to your spec.

https://github.com/fluxninja/aperture/blob/main/scripts/json...


Template languages are odd. Just use a normal programming language to generate whatever you want.


The issue with normal programming languages is that they are rarely properly sandboxed and deterministic. Jsonnet can't wipe your hard disk or steal your passwords.

Something like Starlark is probably a good option.


Jsonnet had a huge opportunity, but due to the extremely slow and heavily opinionated lead, this project literally exists only at Grafana now!


self should refer to the immediately containing object; the outermost object should be called top (or whatever), or perhaps be reachable relatively with self.out.out.out...

Imagine having to take a nested configuration fragment that refers to its own pieces via the absolute self, and trying to plant it in some other configuration, at a different nesting level.


> Or perhaps found relatively with self.out.out.out

Jsonnet's internal predecessor/counterpart/inspiration (Google's BCL/GCL) has something like this, and not including it in Jsonnet is a feature :).


Yep, mixing lexical and dynamic scope was indeed hell.

A convention I use with jsonnet is to use locals to "anchor" some important objects in the lexical scope. My rule is to name the locals with exactly the same name as the field whose object they refer to. Example:

    {
      deployment: {
        local deployment = self,
        spec+: {
          replicas: 3,
          template+: { spec+: {
            local spec = self,
    
            containers: [
              {
                name: 'foo',
                env: [
                  { name: 'REP', value: std.toString(deployment.spec.replicas) },
                  { name: 'SA', value: spec.serviceAccountName },
                ],
              },
            ],
            serviceAccountName: 'foo',
          } },
        },
      },
    }
Adding these locals gives the right amount of friction to avoid using this all the time, while at the same time giving the locals predictable names. The jsonnet LSP tool can easily resolve such a local since it's only a lexical thing.


Some of our customers use Jsonnet to generate JSON-based configurations, including Terraform and even Postman collections.

They choose to use it because:

- it allows them to keep things DRY when replicating a configuration artifact across environments (see the sketch after this list)

- it is accessible for folks who are not as comfortable with program control flow and other programming constructs. They can start by copying and pasting JSON.

- it provides more advanced control flow and logic features for those who need them

- it provides features for template parameter validation

- it is not that hard to learn
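To illustrate the DRY-across-environments point from the list above, here's a minimal sketch with hypothetical file names: a shared base plus a tiny per-environment overlay that only states what differs:

    // base.libsonnet (hypothetical) -- shared defaults:
    //   { app: { image: 'web:1.0', replicas: 1, logLevel: 'info' } }

    // prod.jsonnet -- only the per-environment differences:
    local base = import 'base.libsonnet';

    base {
      app+: {
        replicas: 5,
        logLevel: 'warn',
      },
    }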

Some things could be improved, for sure: IDE support, error messages (which are not the best), and maybe also a more confident community.

Overall, Jsonnet does seem to hit a pragmatic sweet spot for a bunch of use cases, regardless of the merits of other approaches mentioned in this great discussion!


I evaluated Jsonnet and Starlark for configuration generation at a previous job, and found Starlark more practical -- the similarity to Python made it much simpler for everyone to pick up and use.


Honest question, why not just use… JavaScript? I get that this is scoped to only templating json, but the syntax is already diverging pretty heavily from actual JavaScript, while also trying to pretend to be similar? You’re using operators and a self/this keyword and things, but I can only assume they obey different rules.

Could just be config.js. (Or just generate JSON in your app programmatically, it’s a pretty simple well-defined syntax and there’s already a hash-to-json library for whatever platform you’re using)


Scoping it to templating JSON lets the tool enforce a really strict contract that you don't get with JavaScript. For example:

* Accessing a missing field is totally fine in JavaScript and returns undefined; this is an error in Jsonnet.

* Jsonnet guarantees no side effects; that's not true of JS.

* When you run a Jsonnet file, you have a consistent CLI that handles things like external vars, whereas you'd have to write your own wrapper to do this in JS.

* Jsonnet lets you export to multiple different config formats, whereas JS really only supports JSON. Again, you'd have to write a wrapper to do this.

* JavaScript would let you return any valid JS object; your wrapper would need to validate it.

Like yes, you could write your own JS library to mimic a lot of these behaviors, but the constraints are what make the tool useful. I do wish it had better typing though, so you could more easily inspect the structure of the resulting object.
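To make a couple of those points concrete (external vars and hard errors on missing fields), here's a minimal sketch; the file name, field names, and the `env` variable are made up for illustration:

    // config.jsonnet (hypothetical); rendered with:
    //   jsonnet --ext-str env=prod config.jsonnet
    local env = std.extVar('env');

    {
      replicas: if env == 'prod' then 3 else 1,
      // Referencing a field that doesn't exist is a hard evaluation error,
      // not a silent `undefined` as it would be in JS:
      //   badRef: self.replicaz,
    }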


1. can be fixed with typescript

2. can be fixed with Deno - https://deno.land/manual@v1.31.3/basics/permissions

3, 4, 5. Why not have that wrapper to solve those? (AFAIK you do need the wrappers for jsonnet too)


If the constraints are what makes it special, why is it not actually very constrained? If you really wanted something totally focused on having multiple configs share some base config template, then you wouldn't be able to do anything other than overwrite specific keys entirely (JSON deep merge). Jsonnet seems to let you concat strings, do some math using its standard lib (?!), reference other parts of the object, etc.

I also don’t think “object matches type description” is a niche that JS is lacking, there’s tons of libraries out there to assert that.

You’re just trading footguns for different footguns IMO.


Yep, it's all about them tradeoffs. JS gives you absolute freedom; Jsonnet doesn't. JS is easier to debug; Jsonnet isn't, because it's very declarative.


Yes, that's a common problem with config languages. They're torn between competing priorities:

1. Fast to parse with a small engine, good error messages, safe to evaluate.

2. Powerful, can express config with arbitrary logic.

In Conveyor we try an alternative approach. The config is HOCON, which is a superset of JSON syntax designed for human readability/writability/convenience first and foremost, so it's got a very nice and clean feel to it. You can see an example here:

https://github.com/hydraulic-software/github-desktop/blob/co...

It can be parsed with a normal-sized config library and the errors you get are reasonable.

But then what if you hit the limits of what it can express? We added support for "hashbang includes":

    include "#!script.js"
You can embed arbitrary commands in the config which are executed when found unless the app is running in untrusted mode. The script is expected to produce more config on stdout, which is then included. This lets you encode only the minimal needed logic using a full programming language, whilst the rest stays declarative.


In case you want a fresh opinion, every time I stumble upon a config like this, I wish it was just a .json, .js, .py, heck even .pl or .php file rather than that.

Why? It's unclear what the syntactic structure of it is at a glance. Yes, in stricter languages there's more of /[()[\]{}"'`,;]/, but I also know for sure what's a literal, what's an identifier, what's a key, and what's a number or a date. The whole structure and ordering is obvious.

It's the same issue I have with nginx, Terraform, and other ad-hoc half-languages/half-formats/half-templates. The worst part is that you only have to hack on them once a year, but it never "sinks in" for good even if you read the docs.


Thanks. HOCON is a superset of JSON so you can write JSON if you like. There's also a spec:

https://conveyor.hydraulic.dev/7.2/configs/hocon-spec/

You can also convert HOCON to JSON and back. I find it a lot easier to work with configs when you can get rid of superfluous syntax, can use comments, substitutions, include files etc. But you don't have to use those.


I honestly... don't know. I have been using Jsonnet to programmatically generate Grafana dashboards, because AFAIK the only official library to generate them is written in Jsonnet [1].

In my experience, it works, but it doesn't really give me any distinct advantage, and instead it gives me some headaches. Maybe for things that look almost like JSON it'd be helpful, but the moment you start dealing with more complex generation you start finding the lack of typing, IDE support, and easy debugging fairly problematic. For example, something I really dislike is that due to how the expressions are evaluated, the only way to add debug/trace statements is to use them to "transform" a value you're going to use: if you don't use the result of the trace in the final output, the trace does not appear.
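For reference, the trace-by-transforming-a-value pattern described above looks roughly like this; std.trace(message, value) prints the message to stderr and evaluates to the value, so it only fires if that value actually ends up in the output (field name made up):

    {
      // The trace only appears if `replicas` is used in the final document.
      replicas: std.trace('computing replicas', 3),
    }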

Also, I really dislike the error messages. Again, due to the lazy evaluation design, when you mess something up the error message might appear deep in some unrelated call stack and with a jarring lack of context. Debugging it is a real pain.

1: https://github.com/grafana/grafonnet-lib


> debugging is a real pain

Yes! As always, it's mostly a tooling issue. Sure, some aspects of the language, like lazy evaluation, make it even harder, but fundamentally there is no reason we cannot significantly improve the overall experience.

And this is not just about debugging when things go wrong. It's also hard to navigate the codebase and understand where the templates you care about live. Things can happen at multiple levels (that's the feature!), so it can be quite hard to figure out which files you need to touch if you want to change something in the output. I'm working on a tool to answer the question: "if I wanted to change this field in the generated output, which places in the input files contribute to producing this value?"

    $ ursonnet testdata/child.jsonnet '$.deployment.spec.template.spec.containers[0].resources.limits.cpu'   
    testdata/common.libsonnet:27 
    testdata/base.jsonnet:5 
    testdata/common.libsonnet:23 
    testdata/common.libsonnet:22 
    testdata/config.libsonnet:5
The tool lives in https://github.com/mkmik/ursonnet . I got the basics working, but it doesn't work on my larger codebases due to a bug I haven't yet had time to hunt down. Interest/feedback/help from the community would help make this a reality.


Grafana have been working on a replacement for grafonnet but unfortunately haven’t shared much detail. It sounds like they completely rewrote the dashboard spec using CUE, but there will be converters available at release.

In some of the discussions where the community is trying to figure out how to future-proof current dashboard definitions, Grafana Labs has also recommended this Python tool (unofficially I suppose) by Weaveworks, called grafanalib: https://github.com/weaveworks/grafanalib


You may be thinking of the https://github.com/grafana/thema library, which is the extracted version from some other projects / internal tooling.


Here’s the discussion, “Roadmap: As-code,” that I’ve been tracking: https://github.com/grafana/grafana/discussions/39593

Thema is mentioned there, as “the presumed successor to grafonnet,” but that hasn’t officially been confirmed and it sounds like bigger changes might be underway. Discussions are also happening elsewhere, like their Slack, but that link is the most complete overview I’ve found.


Thema is still under heavy development for sure. I chat with Sam about it on the CUE slack from time to time. I have a related effort I am working on for data model evolution, tracking, and transformation generations as part of https://github.com/hofstadter-io/hof


Agreed. Tomorrow they'll realize that templating often requires string capitalization and other operations, and they'll start reinventing Math, String.prototype, and exception handling, if they haven't already.

Never mind: https://jsonnet.org/ref/stdlib.html

It's just a new language with a new runtime, for some reason marketed as a JSON templater. Literally any language with a "json" module is at least equivalent to it, but more familiar to developers.


> Honest question, why not just use… JavaScript? I get that this is scoped to only templating json, but the syntax is already diverging pretty heavily from actual JavaScript, while also trying to pretend to be similar?

Jsonnet's stated aim is to be a superset of JSON rather than something similar to JavaScript.

Having used Jsonnet for a while, it's nice in that it makes it relatively easy to take existing JSON and incrementally turn it into templates. You wind up with the ability to create some nice abstractions for elements of JSON. The close coupling between the syntax of the emitted output and the Jsonnet script itself makes it far easier to write a correct template than when using something like gotmpl to create Yaml in a Helm chart.

Whether or not this is worth another language is a judgement call, but it's not that hard to learn or work with, so I've tended to find Jsonnet a nice tool to have around.


Used it previously to template some Terraform module configurations. The flow was: YAML config -> Jsonnet + defaults -> Terraform + Spinnaker.

We wanted to avoid having to inject a more complex piece into the puzzle. It worked okay at first, but I knew from the get-go that it wasn't going to hold for very long, and eventually there was a Python app that took the YAML config and the Jsonnet + defaults and mashed them together before sending the info to TF + Spinnaker.


One take on that idea: https://github.com/jkcfg/jK


>Or just generate JSON in your app programmatically

Or make an application run on a normal config.


Jsonnet has lazy evaluation. This allows you to create mutual references between different parts of your configuration without having to topologically sort the evaluation order yourself.
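A minimal sketch of what that buys you (field names made up): fields can reference each other "forward", and Jsonnet resolves the order lazily:

    {
      // `url` is written before `host` and `port`, but evaluation is lazy,
      // so field order doesn't matter and no manual sorting is needed.
      url: 'http://%s:%d' % [self.host, self.port],
      host: 'example.internal',
      port: 8080,
    }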



Thanks for kubecfg! Happy user :)


How does this differ from Tanka? At first glance, they seem very similar.


kubecfg is much less opinionated. It will just take a jsonnet file, no need to follow any set directory format or hierarchy of environments.


kubecfg does add some features, like https:// imports and oci:// imports (OCI bundles in OCI registries, transitively bundling all imported files with jsonnet-deps).

But yes, I strive to keep the "one file, one target, import whatever you need, but explicitly" approach as much as possible.

I'm pouring some more time into the project and trying to implement some ideas I've had for a long time but never managed to get out. For example, "Flags From Files" (https://github.com/kubecfg/kubecfg/blob/flagspec/docs/rfcs/r...) or "Caching + optional vendoring of immutable external deps".


"Any sufficiently complicated data templating language contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of a Lisp."

Change my mind. :-P


The first time I heard of Jsonnet was when I learned that the Scala implementation is faster than Google's. I wonder if that is still the case.


Yes, at my work the Go version takes way longer to materialize our configs compared to sJsonnet (from Databricks). The first time you switch, it adds (or removes? I forget) a bunch of newlines at the end of files, which is annoying to handle in git, but otherwise it's faster and the same.


Good to hear. Scala is amazing when used pragmatically.


I love Go templates for this sort of thing, but obviously that only works if you're using Go.


The discoverability of Go templates is terrible, IMHO, since there's no "dir(locals())" equivalent and every execution environment gets to make its own rules about what pipelines/functions are exposed.

Look at Helm as an example: https://helm.sh/docs/chart_template_guide/function_list/ lists some of them, https://helm.sh/docs/chart_template_guide/accessing_files/#p... lists some others, but they also glued in some version of https://masterminds.github.io/sprig/ So, short of (a) knowing that's the case and (b) having 3+ bookmarks in your favorite browser to refer to those reference pages, how would anyone know what pipelines are available?

Separately, I dooooo nooooooot understand why every joker has to invent their own new thing when we have like 50 or so templating languages already. Go may be an outlier in that competition due to the Google Promotion Packet Effect(tm), but how they came up with `{{ range }}{{ end }}` as sane syntax is a true facepalm, to say nothing of the same landmine that Ansible stepped on by not switching Jinja2's default delimiters: `{{` is not _YAML safe_.


Helm does a poor job of documenting its functions, and when you use Helm, you're not really using Go templates; you're using Helm's wrapper around Go templates.

If you use Go templates directly, it's very obvious what functions are available: there are the default ones from the Go template package, and there's whatever you pass explicitly. The Go template package is very small and well documented.

What I love about Go templates is that I can pass any function I want, using good old Go code (meaning I can use SDKs and whatever else). This is really powerful - we're able to have functions that go out to APIs and return data, and it's very easy to manage.


> helm, you're not really using go templates, you're using helm's wrapper around go templates.

Well, that's a tautology for anyone who has had any contact with Go templates outside of quite literally just importing "text/template" and wishing themselves good luck, as your subsequent comment somehow says and yet misses the point. I'd draw your attention to the sibling comment about yet another Go templating gizmo that injects its own cutesy functions with completely random naming into the evaluation namespace.


What's the best of the 50, in your opinion?


own-project plug: https://docs.hofstadter.io


Ruby's hashes have escaped!


My company uses this, but not for Kubernetes. It's a nifty product.



