Node.js 20.6.0 will include built-in support for .env files (twitter.com/kom_256)
196 points by mariuz on Aug 18, 2023 | 123 comments



(Warning: I am being Dennis Downer) Meanwhile, we're still waiting on WebSockets and it's been nearly 10 years.

Everyone should read the PR https://github.com/nodejs/node/pull/48890. It doesn't traverse parent directories, it doesn't have good overwrite/merge logic, multiline support, or variable expansion - things that dotenv has supported for years (sometimes with add-ons). This is another half-useful implementation that the majority will end up ignoring, sticking with better-suited third-party packages instead.

Node is committee'd to death, and the comments in that PR are further proof. I've been in the Node ecosystem since 2010, and my money is on Bun and Deno to lead the way forward.


> It doesn't traverse parent directories

So there is a chance it won't be a massive security hole enabled by default! Great!

> it doesn't have good overwrite/merge logic, multiline support, or variable expansion - things that dotenv has supported for years (sometimes with add-ons)

99% of what I used env files for was just a bunch of key-values for db passwords, so their minimal implementation is probably good enough for most. Then again, that's experience from ecosystems other than Node, and if a config requires "multiline support, or variable expansion" we either write a proper config file or a template for the CM tool to deploy.
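
For reference, per the linked PR the built-in support is exposed as a --env-file flag, so the minimal usage would look roughly like this (a sketch; DB_PASSWORD is just an example key):

    // Before: pull in the dotenv package at the top of the entry point
    //   import 'dotenv/config';
    //
    // After (per the PR): no dependency, just a flag on the node invocation:
    //   node --env-file=.env app.js

    console.log(process.env.DB_PASSWORD);  // hypothetical key defined in .env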


> So there is chance it won't be massive security hole enabled by default! Great!

True that. Sometimes committees lead to bloat. And sometimes committees lead to not adding magic to everything that ends up being abused by malicious actors. I love that Deno and Bun exist and are getting rapidly developed. But I also love that Node has been and continues to be a rock for the industry.


That is a strong argument for using external tools to define the environment, and not installing any of the sort that’d read a .env file on any kind of deployed environment at all (just local dev machines)

So another mark in the “why have this feature in the first place?” column. The whole concept is for dev ergonomics, but this one’s making that worse for security reasons? Ugh. Just leave it out entirely, then.


Fun fact - I added a pretty feature-complete WS implementation to Node core (https://github.com/nodejs/node/blob/main/src/inspector_socke...) 7 years ago (it is well-tested - every debug/profiling session goes through it). There's no public API for it because I did not want to deal with politics.

There's a rudimentary HTTP server as well (no TLS and such, though). All in C++. Should be fairly fast, given the expertise of the Chromium team.


I think it is fine for Node to have limited, basic, built-in support for .env files, and then leave the opportunity for the ecosystem to add in more features.

Easy things should be easy, and hard things should be possible through npm install.


Great way of putting it. It does seem a little off compared to how `npm run` works (it traverses up to a package.json), but I agree .env files could get sketchy.

Maybe it could traverse up to where it finds a package.json actually.


I'm the CTO of a popular Secrets Management platform. It's fair to say that I personally have a lot of experience with secrets and requirements around them, based on conversations with customers.

The primary missing feature here seems to be multiline support. That's super common for keys, certificates, and other JSON configuration. Based on the Node PR, they appear open to adding that later (big +1 on shipping incrementally).

The other missing feature that folks tend to heavily rely on is variable expansion. For that to truly work well, I recommend using a holistic platform like Doppler. That allows for expansion/referencing across environments and projects, like when you have multiple independent services that need access to the same set of secrets (e.g. database creds, error reporting tool, stripe key, etc). You can then update the secret once and have the change propagate to all the places it's used.

Lastly, I'd be remiss if I didn't mention the Doppler CLI and our own fairly unique support for .env files. We've traditionally taken a dim view of .env files because they represent a static, long-lived collection of sensitive information that lives offline. Often, these get checked into a git repo. This is probably fine for personal projects, but a major issue for companies and the security-aware. However, .env files are a pseudo-standard and folks want a way of continuing to consume their secrets via them. Our CLI's approach is to mount a named pipe that we can write secrets to when a reader attaches. That allows us to limit the number of times the "file" can be read (e.g. once), it guarantees that the file's contents are unavailable once the application process dies, and it uses the same open/read interface as a standard file.
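
(A rough sketch of the named-pipe idea, not our actual implementation; paths and values are made up:)

    import { execFileSync } from 'node:child_process';
    import { createWriteStream, unlinkSync } from 'node:fs';

    const FIFO_PATH = '/tmp/app.env';            // hypothetical mount point
    const secrets = 'DB_PASSWORD=s3cret\n';      // would come from the secrets manager

    execFileSync('mkfifo', [FIFO_PATH]);         // create the named pipe

    // Opening a FIFO for writing only completes once a reader (the app) attaches,
    // so the "file" is materialized exactly when it is consumed and never sits on disk.
    const out = createWriteStream(FIFO_PATH);
    out.end(secrets, () => unlinkSync(FIFO_PATH));  // tear down after the single read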

In all, this is an exciting development for Node. I'm glad to see more standard features make it into core and hope that multiline support is a fast follow.


> The primary missing feature here seems to be multiline support. That's super common for keys, certificates, and other JSON configuration.

Easy enough to just Base64 encode the value the way Kubernetes does
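
For example (a sketch; the variable name is made up):

    // Store the multiline PEM as a single-line value in the .env file, e.g.
    //   PRIVATE_KEY_B64=LS0tLS1CRUdJTi...   (output of something like `base64 -w0 private.pem`)
    // then decode it back at runtime:
    const pem = Buffer.from(process.env.PRIVATE_KEY_B64 ?? '', 'base64').toString('utf8');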


Check out keycmd[1], you might appreciate the ideas there. Disclaimer: it's my project.

[1] https://github.com/ClinicalGraphics/keycmd


I agree. Since you seem to know the area, do you have any comment on which contender (Bun or Deno) you see as most likely to take the crown, and also which you’d prefer to take the crown? By ‘the crown’ I mean the status of being the leading JS runtime outside of web browsers.

For me, Deno seems very well thought out, I’ve used it a fair bit and found it to work reliably and fast. I like the whitelist approach to system access although I worry that it could be a hurdle for adoption. I haven’t used Bun, just read about it. My sense is that Bun is more of a 1:1 replacement for Node, still tied to npm as a first class citizen, while Deno has a long term strategy to get away from npm (while supporting it as a legacy thing). Overall I’d prefer Deno to win, but I wonder if Bun has a better chance due to its closer parity with Node.


I think you're right, Bun has a better chance just because it's a drop-in replacement and because it genuinely outperforms the existing tooling by a good margin. Also, LLMs know Node.js a lot better than they know Deno, which may be a significant factor in 2023 and beyond.


> Also, LLMs know Node.js a lot better than they know Deno which may be a significant factor in 2023 and beyond.

I'm not disagreeing necessarily, but wow, I really hope it doesn't come to that. The timeline where languages, runtimes (and by extension libraries, frameworks, and even tools) are chosen based on how well they're supported by LLM tooling sounds horrifying.

It would block adoption of better designed languages, tools and frameworks. If evolution is going to be stifled by LLMs, we'll miss out on paradigm shifts like JS Promises and ES6.


I think it'll get better over the next few years and might have the opposite effect. As soon as we figure out how to keep LLMs up to date with recent developments it'll be easier for humans to adopt new things, not harder. The LLM can consume all the docs in a few minutes and then you'll be able to query it very easily.


Humans are always used to working a certain way, and this fact benefits the incumbent technologies/tools/platforms of the day. This benefit must be outweighed by other factors, or we'd still be writing Fortran (or drawing on cave walls). The addition of LLMs to the mix might add to incumbency bias by being trained on whatever ecosystems are most popular, but I think this will be vastly outweighed by all the other things they bring to the table. We know that one thing LLMs are already pretty impressive at is translating between programming languages. So if anything, I can see LLMs launching a new era of diversification by making it easier for programmers to justify choosing a more esoteric/interesting language for a project without worrying so much about whether they will be able to find library code for their needs, as they'll be able to translate it without so much trouble.


I agree that it sucks, but realistically I think that's the way it already has been, just with stack overflow, blog posts, books, etc. Nobody wants to adopt the new/better stack because it's not mature, so it never gets mature.


> It would block adoption of better designed languages, tools and frameworks.

Tooling concerns are already a major source of friction for adoption; nothing really special about LLMs in that regard.


I know some projects out there, like windmill.dev, have smarter people than me building things, and they consider Bun the successor of Node. They offer support for Deno, but focus more on Bun support because of compatibility. This makes sense for two reasons, I think. One, the future seems brighter due to compatibility. Two, infrastructure work on Node compatibility transfers far more easily to Bun than to Deno.

It’s early still but my sense is that Bun will gradually gain ground as a Node replacement, but Deno will gain ground as something else. I think that’s actually the original intent behind both projects, too. Bun wants to be a better Node, Deno wanted to diverge in order to avoid what the author considers fundamental mistakes in Node’s design.


After reading the comments on the PR, I don't understand the doomsaying here in your comment.


I had the same impression as you. They didn't even close the door on some of those "missing" features; rather they're being judicious about what to include in the initial release. Classic 80/20.


> It doesn't traverse parent directories, it doesn't have good overwrite/merge logic, multiline support, or variable expansion

Sounds good to me. If I need more one day I’ll use dotenv, but having basic support in the stdlib is great


I'm pretty OK with NodeJS being as basic as possible.


.env desperately needs a formal specification. I wrote a bit of code using some features python_dotenv offers but realized after the fact that other software might not parse it the same way. I’ve taken to instead doing my templating in Python itself via pydantic BaseSettings


Isn't it just environment variables, but rather than setting them manually, they're in a file?

So the spec is the same as regular environment variables?


There is no such thing as a spec for setting “regular” environment variables.

Well, there is, but that's a much deeper concept in Linux than what most people get exposed to: in the execve family of functions, the environment is defined in C as an array of strings, with the key separated from the value only by =.

Unless you're referring to how to do it in shell. That wouldn't be applicable to a configuration language. Shell has much richer syntax and more complex logic, allowing variable templating and even running processes inside a string definition using $(). All the quoting rules around that must be different.
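
To make that concrete, from Node the same mechanism is just a bag of strings handed to the child process (a small sketch; the variable is made up):

    import { spawn } from 'node:child_process';

    // Under the hood this ends up as an array of "KEY=value" strings passed to execve;
    // there is no templating or quoting at this level, values are just strings.
    spawn('printenv', ['GREETING'], {
      env: { ...process.env, GREETING: 'hello world' },
      stdio: 'inherit',
    });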


Different dotenv implementations disagree on whether quoting is possible (with quoting comes multiline support). There is more opportunity for disagreement, like whether comments are supported or empty lines are ignored. But all the ones I've used support comments and empty lines (though not all of them support quoted values).


No, python_dotenv allows some rudimentary templating in the .env file. So it’s not just specifying env vars


That's silly, but I don't think that's solved by a spec. Even with specs there is still JSON-LD, JSON 2.0, JSON API, etc.


Nitpick: JSON 2.0 is not a thing. JSON is versionless; maybe you are thinking of JSON5 (which is not a version, it's a superset of JSON with ECMAScript 5 features) or JSONC (also not a version, also a superset of JSON, but this time with comments). JSON-LD and JSON-API are both specs for the contents of JSON, not the data format itself.


https://www.npmjs.com/package/dotenv is the gold standard imho. They have a few other packages you can add on that expand its functionality. I have a custom package which utilizes dotenv, dotenv-expand, and find-up for a truly delightful DX.


I've only ever used .env as such:

    export TWILIO_SID="123123123123"
    export TWILIO_SECRET="123123123123"
And then `source .env` in my terminal, then `rails server` or whatever to run my apps. The environment variables will be present to be consumed from within the app.

Curious what features you use beyond this.


> Curious what features you use beyond this.

There are plenty of outstanding questions.

Is the export required?

Can variables reference other variables?

Can variables reference existing variables in the environment?

Can you use other forms of expressions?

Can you set overridable defaults for values?


I've never used/needed those features from my .env files; that's why I'm asking for use cases. Are you saying you use those kinds of features in python_dotenv? I'm asking btw, no snark.


I use them in python_dotenv for declaring environments that I use to provision infrastructure. As an example, where we have:

    env_type="dev"
    env_name="myappenv"
    primary_subnet_pod="${env_type}-${env_name}-gke-pod-subnet"
    primary_subnet_service="${env_type}-${env_name}-gke-service-subnet"

Which I then might want to use further down, override with defaults, and so forth. It strikes a decent balance in that it's a bit more powerful than a static .env file, but doesn't require you to go full-on templating language with Jinja and the like. It's also cross-platform enough in that you could throw a .env in somewhere and expect most languages to have a client library to parse and load it into the env.

If your use-case is right in that sweet spot it's a pretty good tool. But the behavior, as noted, isn't specced so I can't trust go_dotenv or rust_dotenv or whatever other library to treat it the same way.

Interestingly, in practice, it seems like the Node lib is the canonical version that inspired these other ones, so in a way it serves as some sort of unofficial spec. It's not something I would trust for production-grade code on the strength of an unofficial spec, though.


It's a shell script that should be executable by any UNIX shell. Parsing it (rather than executing it) is why those questions are raised.


In the Node.js community it's normally used via dotenv.

So yes, it's parsed, and that's a valid use case.


If your env file is a script that you source manually, I’d call it env.sh instead.

Historically this has been my approach, but in production environments it's convenient to have a static list of variables with no "export" instead, since e.g. systemd's EnvironmentFile only supports that.


.envs generally aren't full-blown shell, so you don't have `export` available. With that said, I keep a shell function in my .zshrc which, when given a .env file (or, when not given one, defaulting to $PWD/.env), will set -a, source it, and set +a, which loads everything into the environment.


I always prefer to use the dotenv gem, as many times I'd rather stay local to my Rails app than set global environment variables.


Can you provide a concrete example? What do you mean by other software not parsing it the same way?


  FOO="abc
  BAR=123"
Is this one or two vars? What is the value of FOO?


I've never seen something like this in an env file. No one should be putting lines like this in an env file unless it's by mistake.


Your personal opinion of how esoteric or weird this is is beside the point - there are fundamentally incompatible interpretations, where I can bet that switching will be breaking and potentially dangerous for some out there.


Some execute them as a shell script and others parse them as a newline-separated list of key-value pairs.


Or we need to collectively stop using dotenv in favor of other, more formal formats like TOML.


I like ASP.NET Core approach of having an `appsettings.json` where you just specify your settings, and also allowing to have `user-secrets` on a different directory, which prevents you from accidentally adding it to source control (which happens way more than it should). Of course, you can also use environment variables, but creating a JSON file that allows nesting properties is quite nice and probably easier for noobs than figuring out each .env implementation's quirks.


If you’re ultimately going to be reading actual env vars, the nice things that toml gives you won’t be there, and having used it as a crutch may leave you with bugs. If you’re reading env as input, eat the pain early or eat it later.

If you’re not ultimately going to be reading actual env vars, then yeah—why even consider .env in the first place? Of course don’t use it.


> If you’re ultimately going to be reading actual env vars

I would argue that most of the time, you don't need to do this in the first place. Far too often people shoehorn things into environment variables that really shouldn't live there, like secrets and app configuration.

If you're interfacing with existing software that reads from the environment, obviously you don't have control over that, so using a different format is out of the question (unless you're willing to convert it to env vars at runtime, but that sounds like more effort than it's worth).


Kinda surprising it took this long. I really hope Deno continues to grow in popularity. I greatly prefer the "batteries-included" and Typescript-first approach.


I haven't used Deno but have had an easy time using ts-node.

https://typestrong.org/ts-node/docs/


I still find `ts-node` gets in my way often with configuration related issues. `tsx` on the other hand has been a dream.


I've been eyeing ts-node because I'm sick of all the compilation-like ceremonies I need to go through for a supposedly JIT-compiled language, i.e. JS. How stable is it for production use?


With --swc it's damn fast and very stable. The ts-node docs don't recommend using it in production; however, I've seen it done without much consequence.


I use it for a one-person personal project so hard to call that production but it's been really great. Basically seamless typescript in node as far as I've been concerned.


Why would they add built-in support when there's already a reliable, well-supported package? Feels like bloatware and a waste of resources. I'm sure there are lots of core functionalities that could be improved and optimised instead.


There is a clear trend to absorb popular packages into the core, and I believe the main reason is to reduce the number of dependencies. Deno, for example, goes in this direction. Given the recent security issues with npm packages, I think it's a good idea to reduce the number of dependencies.


According to npm, dotenv has zero dependencies. That will reduce the usual dependency count from 12345 to a mind-blowing 12344.


Bloating runtimes seems like the wrong solution to this security issue though, and Node.js has security issues too. A lot of people don't ever need .env files, yet that will now be part of their deployment, which will increase the attack surface of their application.


A well-supported package, plus several tools that you can learn once and use the same for any language or combo of languages. And those will still work when you have to git-bisect back to a version of your project that doesn’t have built-in .env setting.

Shipping this specific thing with nodejs seems deeply pointless to me.

Plus, having my program auto-monkey with its own environment, even optionally, feels… wrong. I should define it (or something should on my behalf) and the program should accept what it gets. Else, why even use env vars? Just use a config toml or some shit. But that part reasonable people could disagree over.


https://github.com/nodejs/node/pull/48890 provides the relevant context in the comments.


because "batteries included" is a good thing.


Is it just me, or does anyone else hate .env files? They accumulate unused cruft, they're not typed, and typically many systems in the same repo point at the same .env file. It all feels so ... implicit ... I guess.

For example in a code base I work in, docker-compose uses it, laravel picks up the same file and Vue / webpack does too. It's a big mess. There must be better solutions out there.


> it's not typed

It kind of is: environment variables can only be strings, so why "type" it when you 100% know that everything within is 100% a String, always?


It’d be useful to have actual booleans and numbers.

Strings are fine for keys and verbatim configs, but suck for flags/toggles/options.


But you can't set a boolean or number to the variable. Environment variables are always strings.


Doesn't seem very hard to work around. You just need to format values.

* "[String|any old content]" * "[Boolean|true]" * "[Number|1]"


There’s no guarantee one is set, right?


There’s also no guarantee the application will be able to reach a database, bind to a port, or read something off of a file system.


You can use a validator like populate-env[1] to check for missing env variables.

[1] https://www.npmjs.com/package/populate-env


A type system doesn't really solve that issue tho.


A "runtime type system" does, meaning something like io-ts or zod.

It also allows statically typing and transforming env strings into much more useful and complex structures.
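
For example, a minimal sketch with zod (the keys are made up):

    import { z } from 'zod';

    // Everything in process.env is a string, so coerce/transform where a richer type is wanted.
    const Env = z.object({
      DATABASE_URL: z.string().url(),
      PORT: z.coerce.number().default(3000),
      DEBUG: z.enum(['true', 'false']).default('false').transform((v) => v === 'true'),
    });

    // Fails at startup with a readable error if anything is missing or malformed.
    export const env = Env.parse(process.env);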


No more so than that any other kind of config file is present and complete, I suppose.


I don't know that this is the best way, but I use a module that instantiates a proxy object. It populates the config object behind the proxy at startup and crashes with a useful message if a value is missing. If something tries to get an undefined value at runtime, the proxy also throws an error. This way the module documents the config, and the app crashes if something is missing rather than silently trying to run with an undefined value. You can also use object.freeze or what have you to make the config object immutable for most practical purposes.
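
Roughly like this (a simplified sketch, not the exact module; the key names are placeholders):

    // config.ts – imported once at startup
    const required = ['DATABASE_URL', 'API_KEY'];         // hypothetical required keys
    const values: Record<string, string> = {};

    for (const key of required) {
      const v = process.env[key];
      if (v === undefined) {
        console.error(`Missing required env var: ${key}`);
        process.exit(1);                                   // crash early with a useful message
      }
      values[key] = v;
    }

    // Reads of anything not populated above throw instead of silently returning undefined.
    export const config = new Proxy(Object.freeze(values), {
      get(target, prop) {
        if (typeof prop !== 'string') return undefined;    // ignore symbol lookups (e.g. console.log)
        if (!(prop in target)) throw new Error(`Unknown config key: ${prop}`);
        return target[prop];
      },
    });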


I don't know if that's the best way either, but it's the model all the most successful engineering cultures I've been part of have used, which certainly says a lot for it in my view at least.


TypeScript types them as string | undefined.


I personally prefer to stick to config files in whatever format is native to the ecosystem (TOML, YAML, INI). Env files fill the same role for apps which are configured by env variables, but you don't really need to use env variables there. It was forced by Docker some time ago as the single supported configuration format, but nowadays k8s happily accepts configs and mounts them as a file, so why jump through the hoops of .env -> vars -> app?


I personally don't feel the same; I have no idea why you'd want a typed environment variable setup. And I find they work great.

I don't see any problem with 4 different things using it.

The point of it is basic: rather than VARIABLE=WHATEVER, you just copy and paste a .env and it picks it up.

But you can also just use regular environment variables if you want.


> I have no idea why you'd want a typed environment variable setup.

Because you want to know what type your variables are?

Of course, in this case you do know, they're strings.


You can trivially write a schema file and a method that checks the existence and type of the env var on startup. I actually did on some projects. It's not needed most of the time so I could only see this as an npm module.


Manually writing reliable file IO, serialisation and schema validation on every project is a colossal waste of time, effort and energy collectively.

I for one am glad that with 2 lines of code I get type-safe environment variables with config overrides in Go, and never have to think about it ever again.


Colossal waste?

- have a .json with format [{ name: AUTH_TOKEN, type: String }] (you can add extras like regex match etc)

- have a method that goes through all items on startup and checks process.env existence plus the type.

- use this method as a getter method based on name

- search-and-replace all other process.env usage and disable it via linting so that no one can get around the restriction

voila.
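
Roughly (a sketch with hypothetical file and key names):

    import { readFileSync } from 'node:fs';

    // env.schema.json (hypothetical): [{ "name": "AUTH_TOKEN", "pattern": "^\\w+$" }]
    const schema: { name: string; pattern?: string }[] =
      JSON.parse(readFileSync('env.schema.json', 'utf8'));

    export function getEnv(name: string): string {
      const entry = schema.find((e) => e.name === name);
      if (!entry) throw new Error(`${name} is not declared in env.schema.json`);
      const value = process.env[name];
      if (value === undefined) throw new Error(`Missing required env var: ${name}`);
      if (entry.pattern && !new RegExp(entry.pattern).test(value)) {
        throw new Error(`${name} does not match ${entry.pattern}`);
      }
      return value;
    }

    // Check everything once on startup so misconfiguration fails fast.
    schema.forEach((e) => getEnv(e.name));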

Oooor use these libs from other people who had the same idea (I just found them as well)

https://www.npmjs.com/package/envalid

https://www.npmjs.com/package/env-vars-validator


I like them for defining a small number of env vars for my app that only get injected at run time. I don't like them when other random tools that aren't my app use them. Docker's insistence on automatically using the .env in your home directory is absurd.


The reason why we continue to use those is that they are easy to load in bash (source the file), can be hidden away in the env in production, don't have inheritance, and aren't yaml.

I feel you on the typing, but the format could accommodate such a tool. Code away! :)


Environment variables are all strings. That’s not .env files’ faults, it’s how env vars work.

If you don’t want/need to read env vars… why use .env? If you do need to, then you’re stuck treating them as string input, sure, but that’s got nothing to do with .env files. They just help set env vars. If you get rid of your .env file but are still reading from env vars, all you’ve done is tie one hand behind your back.


Not perfect, but you can use a validation library to explicitly pull in ENV variables and they will be typed. https://sergiodxa.com/articles/using-zod-to-safely-read-env-...


You can create a class or something that reads it in at application startup and throws up if something you expect isn't there. You can't really enforce a rule to not access `process.env` directly, but it at least gives you types to access the env should you choose to use the wrapper.


I have utilized this to decent effect: https://github.com/eslint-community/eslint-plugin-n/blob/mas... and then just had agreement/dictator state that the environment loader module was the only place in code that was allowed to disable that rule.


Yeah, I don't like environment variables because it's all stringly typed, and the lack of a thing you deserialize the config file into means that, practically speaking, you can read the values from anywhere in a program; figuring out what's permissible in environment variables is very often a disorganized mess. .env files take all of the messiness of environment variables and then ... put them into a file, which is typically the thing that people who are using environment variables are trying to avoid, so ... I dunno, just use a configuration file imho; .env files are the worst of both worlds.


You can easily have both .env and types. On any TypeScript project you can just pass your process.env through an io-ts, zod or similar library and keep it in a side-effectful module which validates on first import and keeps it globally available.


Wait until these folks realize all the dozens of applications and daemons their program needs to build & run are all taking lots and lots of string input and performing various tests and transformations on it to ensure correctness and do type conversion. It’s totally normal and happens constantly any time you’re using a computer.

Is this attitude/phobia a consequence of folks not starting their programmer journey with simple command line programs, anymore?


I've used env-smart[1] for typed .env files before. It works OK.

[1] https://www.npmjs.com/package/env-smart


This is what Dhall is designed to solve. If I was on a big team I’d be looking closely at it https://dhall-lang.org/


The "Can you spot the mistake?" thing just instantly killed any interest I had in that presentation. It's the job of that page to sell me on it, not to try and make me do homework.


> Is it just me or does anyone else hate .env files? They accumulate unused cruft, it's not typed

Try cuelang.org


When will React env variables work on Docker images without any bullshit hacks? I mean when I have my frontend and backend in separate pods, use nginx to serve the frontend in Kubernetes, and want to use BACKEND_URL as an env var. Why is it so goddamn hard? Any high IQ people have tips on how to get this to work without some weird injections before building the image? Me dumb.


I'm honestly a bit confused by this statement, as "React env variables" can mean various things, depending on your setup.

Backend and frontend can be separate runtimes... The backend could be an API-only-server and an OS process with all permissions that come with it (reading env vars during execution as it pleases). The frontend, eg. a React SPA, could just be a bunch of built HTML/CSS/JS-files (type-checked, bundled, minified, etc.), served through a static file server (separate backend, nginx?) and interpreted through a browser engine on the client.

Are you asking for nginx to parse the served HTML/CSS/JS files and replace unknown placeholders with env var values? If so you are prob asking for an SSR framework[0], meaning a backend that treats a React project similarly to template files, and is able to inject env vars?

[0]: Next.js - https://nextjs.org/docs/app/building-your-application/config...


There are two things that make this annoying:

* most people are pre-building their packages. That means you can’t really modify anything at run time. It also means it can be extremely tedious to test changes and wait for the build process.

* there’s a string of places where you have to properly “expose” ENV variables to a layer deeper down. This includes some build configuration to make sure you’re not exposing sensitive BE configs. It can be challenging debugging the intermediate steps.


I'm not 100% sure the problem you're trying to solve, but I've solved a similar sounding one that may be of use to you too.

We needed to set env vars on React applications (the frontend) inside Docker containers. The containers were going to be built in CI/CD, and we didn't want different builds for each environment.

We solved this by adding a small JS script that is included in the base HTML. It pushes env vars into the window object, so they're accessible from window.env.

The JS file is generated by a Bash script that's run whenever the container starts (by modifying the entry-point.sh file used by Nginx's Docker image).

The Bash script enumerates any environment variable with a given prefix.

Docker already allows you to set runtime environment variables from the run command, or via Docker Compose.

In the end, the developer experience comes down to "docker run image:tag -e PREFIX_FOO=BAR".
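
For illustration, a simplified sketch of the pattern (names and values are made up):

    // env.js – regenerated by the container entrypoint on every start and loaded by
    // index.html before the React bundle (only variables with the agreed prefix):
    //
    //   window.env = { PREFIX_BACKEND_URL: 'https://api.example.com', PREFIX_FOO: 'BAR' };

    // In the React code, read runtime config from window.env instead of build-time process.env:
    const backendUrl = (window as any).env?.PREFIX_BACKEND_URL ?? 'http://localhost:8080';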


Might be misunderstanding the problem, but what’s stopping you from using env variables like you specified? Create React App has this baked in by prefixing env variables with REACT_APP [1]; I’m assuming other configurations can achieve the same.

[1] https://create-react-app.dev/docs/adding-custom-environment-...


When I see comments like yours, I'm so glad Kubernetes exists, because I know some of our competitors may be wasting precious cycles trying to make the tech work instead of shipping value.

I pretentiously claim that 90% of people using Kubernetes should instead write efficient algorithms and db requests, and maaaaybe look at a CDN and/or buy a bigger server. Like, are your servers really that busy? Stack Overflow is 9 servers.... I can testify to some very large, compute-intensive services being run on just a couple of servers with the right tech and less abstraction :)


Who do you think I am? I can just hardcode the values and I waste no time on this. However, it's not an elegant solution, so that's why I'm asking. Also, I wouldn't even be using kubernetes normally but this one project just happens to require it for 3rd party reasons that are out of my control. I'm sure you feel very smug but that's just delusion for the most part.


Note that there is no .env file format standard and different tools (Node, Docker, etc) have different formats.

Shdotenv helps to convert/reuse environment files across tools

https://github.com/ko1nksm/shdotenv


Haven't dug further in; wondering if it will support multiple files. We use two files: one for things that everyone should have in common, and one for things that are different per individual and not committed.


It should. .env.local is a common pattern


Reminds me of VSCode never being able to automatically syntax-highlight .env.local or similar… or is there some setting I'm missing?


`.env.*` files won't highlight in either VSCode or Neovim, but it works fine with `.*.env`. I guess it's some kind of convention.


It's because syntax highlighting rules are applied based on the file extension. As long as .env is the extension, it'll work. If the file ends with .local, VSCode has no idea that it's supposed to be a .env file type.


Yes sure, but since .env.local and .env.prod are so widespread one could make a small filter to also include these types of files, no?
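
E.g., something along these lines in settings.json might do it (a sketch; assumes a dotenv syntax extension is installed that registers a "dotenv" language id):

    // settings.json – user or workspace settings
    {
      "files.associations": {
        ".env.*": "dotenv"
      }
    }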


Especially considering the GitHub .gitignore template for Node only ignores .env.local, not .local.env: https://github.com/github/gitignore/blob/main/Node.gitignore...


Am I still the only one who thinks storing all of your secrets in plain text while using a package manager that allows arbitrary code execution is a terrible idea?


Nope, but if your dependencies are reasonable and don't do a bunch of stuff in pre/post-install you can disable it with `--ignore-scripts`.

IMO it would be much better if that was the default and dependencies had to ask to run scripts (perhaps with a whitelist in package.json), but unfortunately Node made a bunch of "helpful" mistakes and now it's hard to roll back without breaking.

Maybe if they had Deno's permission model they could isolate things so that a dependency could only read/write to its own directory within node_modules.


What’s the motivation? I don’t want multiple things reading the file if I’m not working solely in nodejs—I want one tool reading it and setting env vars for all the rest—plus plenty of tools already exist to do this and most are trivial to install and use, and overall it seems like another way for things to go wrong and just a platform-specific Yet Another X burning person-hours to… what benefit?


I'm sure they will find a way to do it securely and not have a user uploading the wrong file name fuck up the security of the app.


But why, though? .env (the library) has never made sense to me. There should be a single binary called dotenv that takes a command plus args as arguments and runs that command with the .env file's values in the environment. Why do runtimes need first-class support for this?


Not a big deal! It will not change much and you may still end up using a specific, cross-platform library anyway.


Environment variables should never be used. This was a known security hole already in the last century. Several OSes solved this with config files that come with file permissions. That's the correct solution, on Linux and Windows at least. And still people use envs.


That's unrealistic and not scalable. Heroku and the 12factor pattern use this because it creates a boundary between values meant to be configured by the user and those internal to an application. Environment variables can be injected into a subprocess without them ever existing on disk.


Envs are not process-secure. Files are. RAM disks exist.


Why? Seems like it goes against the Unix principle. I've been using direnv to set the environment for any software, containers, Python virtualenvs and more, and it's working quite well. Will Node.js now conflict with my direnv files?


Will it have support for "VARIABLE=${VARIABLE:-default}"?


No


Unfortunate, thank you


"node_modules" <mutter>



