
Also: https://www.formgraph.com - a bit richer (multiple layers, drawing tools, social features, etc)


Cool! I built something similar a while back. A bit more structured (creating accounts and such), but with a bigger variety of drawing tools, and some social features like following, project collaboration, upvotes/downvotes, and chat. https://www.formgraph.com


https://www.formgraph.com - real-time collaborative drawing in Vue.js; the drawing client alone is over 5k LOC of Vue.js.


Oh, this is nice. Do you work there? I'm wondering what the experience was like for the devs making this.


I wish I knew earlier that I had enough knowledge to build production quality applications. I had this idea early on that I had to attain some level of mastery that was always just outside of my reach, and that the only remedy to that was learning everything possible under the sun. But after spending years trying to get to that point, I realized that I could have been building and shipping products that whole time. If I could do the last decade over, that realization would have the most profound impact.


Thanks for this comment! I appreciate it as it's something that I've been suspecting for the past year.


I remember those anxious days before Longhorn's release. MS was making a lot of weird threats back then. All the "trusted computing" stuff, as well as the thinly veiled attempts at killing OpenGL on Windows. Despite my above waxing nostalgic about the API, I haven't built anything with D3D in years, for precisely the issues you mentioned. API design quality and performance are important, but they are far from the only factors that go into making such a decision.


My favorite is how, from Intel's claim that the 915 chipset can't support WDDM because it doesn't have a "hardware scheduler", I was able to guess that they were probably referring to the "lost devices" issue in old DirectX that many programmers likely remember.

Edit: Old Slashdot thread: https://slashdot.org/comments.pl?sid=4123465&cid=44660047


D3D9 is my all-time favorite 3D API. It was at that sweet spot of being powerful enough that you could do stunning things with it (the Xbox 360 generation is effectively D3D9-level hardware), yet simple enough that amateurs and hobbyists (and high school kids, as I was at the time) could dive in and get stuff on the screen relatively quickly. D3DX had some handy helper libraries for loading models, doing transforms, and loading textures. I get the benefits afforded by subsequent iterations, and nowadays Vulkan and D3D12, but their interfaces are so optimized toward specialists, and it would be nice if more were in place to cover the middle-ground use case: "I don't need a full-on 3D game engine, but I don't want to worry about low-level GPU stuff either." Of the new APIs out there, I feel like Metal is the only one approaching layperson usability, and it's trapped in the world of Mac and iOS. Of course you can stitch together something that looks like an open-source equivalent of the old D3D9 SDK yourself from things like glm and AssImp, but it's nice to have all that in one package.
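For the curious, this is roughly the level of ceremony I'm talking about - a from-memory C++ sketch (device creation, error checks, and Release calls omitted; the file names and variables like aspect are just examples), so take the exact signatures with a grain of salt:

    // One-time loading: D3DX hands you texture + mesh loaders out of the box.
    LPDIRECT3DTEXTURE9 tex = NULL;
    LPD3DXMESH mesh = NULL;
    D3DXCreateTextureFromFile(device, L"brick.png", &tex);
    D3DXLoadMeshFromX(L"model.x", D3DXMESH_MANAGED, device,
                      NULL, NULL, NULL, NULL, &mesh);

    // Per frame: fixed-function transforms, no hand-rolled matrix library needed.
    D3DXMATRIX proj, view;
    D3DXVECTOR3 eye(0, 2, -5), at(0, 0, 0), up(0, 1, 0);
    D3DXMatrixPerspectiveFovLH(&proj, D3DX_PI / 4, aspect, 1.0f, 1000.0f);
    D3DXMatrixLookAtLH(&view, &eye, &at, &up);
    device->SetTransform(D3DTS_PROJECTION, &proj);
    device->SetTransform(D3DTS_VIEW, &view);
    device->SetRenderState(D3DRS_LIGHTING, FALSE);

    device->BeginScene();
    device->SetTexture(0, tex);
    mesh->DrawSubset(0);
    device->EndScene();
    device->Present(NULL, NULL, NULL, NULL);

That was the whole "middle ground" in one SDK.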


Same here, but replace D3D9 with OpenGL 2.x, or "compatibility profile" as kids call it these days. Basically the entire OpenGL, instead of the crippled "core profile" that was originally made to give troubled implementors (cough ATI cough) a chance to make something that works (which failed, and we still ended up with the schism).

Fortunately this is still supported in its entirety by all sane desktop implementations, and at least Nvidia has said that they will always continue supporting it. The only sore spots are Apple and Mesa, but Apple seems to have abandoned the OpenGL ship anyway, and Mesa (or a fork) will hopefully implement it at some point. In the meantime there is Regal, although I'm not sure how complete that is.


I find that OpenGL ES 3.0 hits the GL 2.x sweet spot for me. The main reason I prefer it over desktop 3.0 is that it doesn't require the use of VBOs, which always seem to complicate simple programs (even though I keep being told that I can just set one up and bind it, then forget about it - see the sketch below).

I especially like that it's hands-off with regard to a lot of things like matrix manipulation and geometry optimization. It gives you an opportunity to think about these things clearly and get an optimal solution, without later having to port all of your matrix code out of somebody else's library.
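For reference, the "set one up and forget about it" pattern people keep pointing me at looks roughly like this - a minimal sketch, assuming a compiled shader with attribute 0 as position and a verts array defined elsewhere:

    // one-time setup: copy vertex data into a buffer object and describe it
    GLuint vbo;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void*)0);
    glEnableVertexAttribArray(0);

    // per frame: the buffer stays bound, so just draw
    glDrawArrays(GL_TRIANGLES, 0, vertexCount);

It's not a lot of code, but it's still more moving parts than a plain client-side array for a ten-line demo.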


Sure, ATI/AMD (and Intel on Windows) lagged behind NV in the full OpenGL implementation. But the "compatibility" profile is full of a lot of cruft that has nothing to do with modern GPUs, and whose primary value was passing workstation test suites, not actual modern GPU programming.

I know we have the "but mah immediate mode" crowd, but those folks have to just get off it. Write your own immediate mode recorder if you're so desperate. I'm also looking at you, matrix stack abusers. Your programs are bad, and you should feel bad.
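And honestly, a recorder is about this much code - a bare-bones sketch, assuming GL 3.x headers, a bound shader with attribute 0 as position, and only positions being recorded (everything else is left as an exercise):

    #include <vector>

    // Buffers glVertex-style calls, then uploads and draws them in one shot.
    struct ImmRecorder {
        std::vector<float> data;   // interleaved x, y, z
        GLuint vbo = 0;

        void begin() { data.clear(); }

        void vertex3f(float x, float y, float z) {
            data.push_back(x); data.push_back(y); data.push_back(z);
        }

        void end(GLenum mode) {
            if (!vbo) glGenBuffers(1, &vbo);
            glBindBuffer(GL_ARRAY_BUFFER, vbo);
            glBufferData(GL_ARRAY_BUFFER, data.size() * sizeof(float),
                         data.data(), GL_STREAM_DRAW);
            glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void*)0);
            glEnableVertexAttribArray(0);
            glDrawArrays(mode, 0, (GLsizei)(data.size() / 3));
        }
    };

Add colors, texcoords, and a matrix stack on top and you've rebuilt the parts of the compatibility profile you actually used.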


> I know we have the "but mah immediate mode" crowd

I'm in the "but mah immediate mode" crowd, so thanks for the condescending tone. I like immediate mode and matrix stacks and all the nice stuff the compatibility profile provides, because they are convenient even if they aren't fast. Quite often what you want is convenience, not performance, and adding a usually badly documented 3rd-party library that might be dropped next year (or whatever) or break its API is not a desirable solution. Especially when the entire point is moot, since OpenGL 1.x/2.x and 3.x/4.x compatibility isn't going anywhere anytime in the next couple of decades - at least.

Besides, the parent post is about what someone's favorite API was, so I also added what my favorite API is. D3D9 isn't any more deserving of being someone's favorite API than OpenGL.


I would actually argue that it is _bad_ for graphics. The GPU abstraction that OpenGL represents is now 30+ years old and not at all representative of how modern GPUs (desktop or mobile) work. I'd even argue that OGL obfuscates the GPU machinery for all the budding graphics programmers who use it. OGL was never meant to be a graphics toolkit/middleware, but that's what it's become (at least the compat profile).

As far as the value to you, to be perfectly frank, you would not have the compatibility profile if it weren't for workstation ISVs demanding that OpenGL 1.1 still be supported for their legacy applications (and shelling out the big bucks to maintain that support). There isn't a single IHV who is keeping these legacy features on for any hobbyist or learning programmer. Not one. The IHVs know it's bad, but there's a lot of money in keeping it around.

I don't even know if there's value in mentioning this, but I used to be an OpenGL driver engineer, and when I worked on it I realized, "Hey, wtf, this has nothing to do with how the GPU works!" I'm still happy/proud of the work I did, but I don't think it advanced the graphics state of the art (for the most part; there are some cool things that showed up first in OpenGL). And as far as the cruft I mentioned... it's just a psychotic maintenance nightmare.


That is all nice, but I don't see the point of trying to convince me that I shouldn't like something I like :-P. Regardless of how well OpenGL represents how GPUs work, or why it is still around, it doesn't change the fact that it is still around and I like using it.


If you are talking about OpenGL compat profile, Mesa will not go beyond 3.0. You might or might not be able to convince the project to change that with patches, I am not sure what's the attitude on that.

If you are talking about D3D9, for Gallium driver (relevant ones are r600, radeonsi, nouveau), there is Gallium Nine: https://wiki.ixit.cz/d3d9


Yeah, I am considering at some point trying to see what it would take to add support for the compatibility profile. I do not see a reason not to have it, considering that the functionality is mostly there (and AFAIK there is even an environment variable that sort of allows the creation of compatibility profiles, it just doesn't fully work). If nothing else, not having it makes the Mesa implementation inferior to the proprietary ones.


D3D has always been a low-level API and D3D9 is no exception. D3D9 just targeted really simple hardware, where pixels were pushed through a rigid pipeline one primitive at a time.

Interestingly, the Xbox 360 was already beyond DX9: it had real, unified shader units (i.e. the same unit could execute both pixel and vertex shader instructions) and even the ability to write data back from a shader (if you wanted to use a shader to process non-pixel data on DX9-level hardware, you still had to write the output as pixels, because the color/depth buffer was the only output of a shader). So the X360's graphics library extended DX9 a lot to support that hardware.



Vulkan blows open the doors for anyone to create a library with the power and ease of use of DX9, rather than it being tied to OS vendors.


Why not OpenGL 3.2+? It is capable of modern effects and pretty easy to use.


Console developers don't get much choice: You pretty much must use D3D9, D3D11, GNM(X), etc.

Mobile's not much better: OpenGL ES for Android (which is a far cry from real desktop OpenGL - to the point that I consider it a completely different graphics API that just happens to confusingly reuse the names of some functions.) Maybe Metal for iOS.

Real desktop OpenGL is one of the few graphics APIs I've never been forced to use - so why spend even a single dev-month porting to it? Even games with OpenGL render paths may work better using their Direct3D renderers on Linux via Wine!

Even counting Linux/OS X, at this point I'm not even convinced OpenGL is actually the more portable API...


I've written an iOS app, and the OpenGL code was (except for a very few ifdefs) identical to the desktop code. So I wouldn't say that ES is a completely different API; you can certainly reuse a lot.


Not even our shaders escaped unscathed - no rectangular matrix support (mat4x3? nope!), mandatory precision specifiers (lowp-highp), no Uniform Buffer Objects. You can't even call glTexImage2D on ES 2 without perfectly fine OpenGL code failing - because format != internalFormat is forbidden (read: documented "must be the same"), and you, being a good explicit OpenGL citizen, asked for something as horrifically complicated as format=GL_RGBA and internalFormat=GL_RGBA8. Multiple render targets? Are you out of your mind? We can't have that - goodbye deferred rendering, hello old-school forward rendering!

Even your bread and butter - functions like glUniformMatrix3fv do things like just outright ignore the transpose parameter. This is extremely well documented: "Specifies whether to transpose the matrix as the values are loaded into the uniform variable. Must be GL_FALSE." ( https://www.khronos.org/registry/OpenGL-Refpages/es2.0/xhtml... )
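For anyone who hasn't hit these yet, this is the kind of thing that ends up sprinkled through your code - a rough sketch of both workarounds (the USE_GLES2 define and the transpose3x3 helper are made up for illustration):

    // Texture upload: desktop GL is fine with a sized internal format,
    // ES 2 insists internalFormat and format be identical.
    #ifdef USE_GLES2
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,  w, h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    #else
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    #endif

    // Matrix upload: ES 2 requires transpose == GL_FALSE,
    // so do the transpose on the CPU before the call instead.
    float transposed[9];
    transpose3x3(rowMajorMatrix, transposed);   // hypothetical helper
    glUniformMatrix3fv(loc, 1, GL_FALSE, transposed);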

I swear I've reused more code between D3D11 and PS4 codepaths than between OpenGL ES and OpenGL codepaths, and the D3D11 and PS4 APIs don't share a single common function name betwixt them!

I must assume going from OpenGL ES 2 to real OpenGL is a bit easier - even if you're probably giving all the fast paths in the latter a wide berth in doing so. Or maybe OpenGL ES 3 is a bit closer to real OpenGL. But OpenGL ES 2 vs desktop OpenGL? They both render triangles, but beyond that it's a crap-shoot, and a lot of #ifdefs in my experience. Separate files, even. Not just a few.

EDIT: s/codepaths/apis/, finish summarizing my thoughts.


Yes, calling OpenGL ES "OpenGL"-compatible rings true only to those who never had the "fun" of writing portable code across multiple GPUs.

At the end of the day, the code paths, extension support quirks, and driver workarounds are so numerous that one might as well be coding against multiple APIs anyway.


I guess you're right, as I was only using rather basic stuff (e.g. no lighting, no shaders), just framebuffers, VBOs, etc.

(replied to my own comment by accident first)


Because fonts, textures, windowing, effects, and materials all require the use of third-party libraries outside the spec.

Also, it is stuck in a C-world API, with each language binding being a third-party library with varying degrees of coverage.


All of that is outside of the spec. That has nothing to do with 3D graphics. MS just lumped it all together and called it DX.


I think that is kind of the point.


Well, it's a counterpoint really, since all of that is unportable (without some kind of translation layer).

As developers put it here[1]: "Don't make a game that depends on Direct3D. All the hard work is getting the thing to run with OpenGL".

1. https://www.gamingonlinux.com/articles/about-linux-games-bei...


That's stupid. Competent 3D on Windows is DirectX. Competent 3D on Mac/iOS is Metal. Competent 3D on consoles uses their proprietary APIs. So what is the point of OpenGL portability? Between Linux and Android? You _will_ have to switch APIs if you are making anything worthy.


That's some impressive graphics fetishism on display there. It's hard not to take offense at the comment that anything less is not "worthy". Not everyone is a AAA studio with an engine development team, nor should everyone hold themselves to those standards of engine performance.

OpenGL 3.3 is a pretty nice target. It's easy to port an OpenGL 3.3 game to a lot of systems, including mobile (since OpenGL ES 2.0 is pretty similar) and Windows XP (which cannot run DirectX 10). (This matters less as time goes on, but it's been an important point in certain markets in Asia.) So you can write the graphics code once, more or less, and run it everywhere, more or less.

The idea that somebody who chooses to do this is therefore not "competent" just boggles the mind. No point in deriding engineers who decide not to use the latest bells and whistles in the graphics pipeline. Maybe they just have other priorities.

(To clarify: the recommendation that everyone should just use OpenGL regardless of situation is, in fact, stupid. So I agree with that part. But it's equally stupid to say that everyone should use D3D12 / Metal regardless of circumstance.)


> That's some impressive graphics fetishism on display there.

More like compatibility and stability fetishism. nVidia and AMD can't even agree on how to compile the same GLSL shader - I'm much more confident about my ability to ship working D3D9 or D3D11 than OpenGL without a QA team with a wide range of hardware and driver revisions. D3D? Uniform bytecode. Uniform debug layers. Nice. Being indie just makes it easier - simpler rendering pipelines, easier to port.

For Android dev, where I have no choice about using e.g. OpenGL ES 2, the compatibility mess is so bad that even as a solo dev, I've been eyeing AWS Device Farm and Xamarin Test Cloud. Maybe AAA studios can afford the QA time - but I can't even afford enough phones to test to my satisfaction. And in my heart of hearts, I blame OpenGL, even if it's really the fault of mobile GPU vendors. I have a much weaker urge for a PC device farm, where D3D mostly just works. It'd be a much stronger urge if you threatened me with the prospect of supporting desktop OpenGL.

The biggest company I've worked for had about 50 people, and usually I was on a much smaller team. I don't think that's AAA. Even wasting milliseconds left and right, we didn't need much per-platform tuning - there was still lower-hanging fruit around. I still almost agree with euos. Aside from compat - getting OpenGL working on a console or in the WinRT sandbox is probably more work and worse results than just doing a straight-up port. Worst of all worlds.

> The idea that somebody who chooses to do this is therefore not "competent" just boggles the mind. No point in deriding engineers who decide not to use the latest bells and whistles in the graphics pipeline. Maybe they just have other priorities.

Agreed - competent engineers and managers may decide that competent 3D support is not worth their time. I welcome this restraint when it comes to Excel's pie charts and the good fight against scope creep. If you're targeting the holy trifecta of Windows, Linux, and OS X, and little else - OpenGL might be right for your MVP and your launch window.

Yet I'd still be eyeing that OpenGL on Windows as possible technical debt. Hell, I basically look at OpenGL ES on Android as unsolvable technical debt. I'd have a hard time labeling it "competent 3d". And I'd be wondering if it was really better than D3D9 + Wine.

Metal might be a little flavor-of-the-minute. I need to give it a shot sometime...


For uniform bytecode and standard validation there is Vulkan now.


> OpenGL 3.3 is a pretty nice target

Intel HD 3000 (6 years old, much newer than Windows XP) only supports OpenGL 3.1 on Windows.

VMware Workstation only supports OpenGL 3.0 for Win7 guests.

Direct3D 11 works flawlessly on both of them.


Good luck getting that Direct3D to run on OS X or Linux without some hack solution like Wine.


You are assuming one wants to do it.


Clearly some do. That's why Wine is working on it (FOSS) and there are closed translation layers too, like from Feral and VP.

I obviously mean DX11 on Linux for running Windows games, not DX11 on Sandy Bridge.


Intel HD 3000 has better OpenGL support on Linux (thanks to Mesa). It's at 4.0 with software implementation of ARB_gpu_shader_fp64. And I doubt it supports all of DX11 properly as well.

I don't think you need to worry about that where gaming is concerned, though. It's too old and below the minimum requirements of a huge number of games already.


Yep, on Linux and OSX it supports OpenGL 3.3.

The hardware fully supports DX 10.1. The DX 11 API works fine on it, with feature level 10.1.

> I don't think you need to worry about that when gaming is concerned though

That’s correct for AAA titles. Casual gamers however often play on PCs without a dedicated GPU.


> That’s correct for AAA titles. Casual gamers however often play on PCs without a dedicated GPU.

Even in such a case, they wouldn't commonly be using a Sandy Bridge generation GPU. And those who do use one aren't expecting recent games to support it (whether the games are demanding or not). An increasing number of games already require OpenGL 4.x, even if they are not very demanding in practice. And now with Vulkan, older Intel GPUs simply won't cut it anymore.


Choose OpenGL 3.3, and you’ll lose gamers running Windows on a Sandy Bridge GPU.

Choose Direct3D 11, and you’ll lose Linux gamers.

Are you sure that, in absolute numbers, there are more people in the second group?


Modern engines will use Vulkan anyway.


Maybe they will, maybe not.

Until that happens, software developers need to choose whichever GPU API works best for their particular project.

Personally, I have not developed games for several years.

For the last 1.5 years, I’ve been working on CAD/CAM software. Traditionally people use OpenGL in this area; I picked D3D 11 instead. The renderer is reasonably sophisticated; there are many complex shaders in there. The software is now used by thousands of customers worldwide, and yet there have been very few rendering-related bugs so far.


That's smart. Those who don't think about it in advance are bitten by it later. Read the article above.

Metal is Apple's lame attempt to tax cross platform graphics developers. That's exactly what is emphasized there.


The article you linked to is advice from people who port games to Linux on how to make games more portable. It's not advice on how to ship them on time, how to make them performant, or how to minimize the amount of developer labor required to build them. Given what a small share of the market Linux is for games, developers have other priorities than what makes games easy to port to Linux. And it's not a chicken-and-egg problem; the Linux market for games isn't much smaller than the Linux market for any other commercial software.


Performance is an issue orthogonal to portability. You can make it portable and perform well too (using something like Vulkan and proper engine design).

The Linux gaming market is growing. Not sure about other commercial software in this context, but I'd guess it can indirectly be affected too, since a bigger gaming market makes Linux desktop usage grow in general.


So, "Here, use a different API with worse developer support, and in return you get access to a market that's ~1-2% the size of the Windows gaming market?"

I mean, I'm all for OpenGL. I use OpenGL. Hell, I develop OpenGL games on Linux and then port them to Windows afterwards (I've released ~10 made that way so far). But the choice to use it depends on what market you're going after.


Vulkan has better performance than DX11, and it has literally twice the audience of DX12 - DX12 only supports Windows 10, so any W7/W8/etc users are SOL unless you use DX11 or Vulkan.



This is irrelevant. With all major engines gaining Vulkan support, this list will balloon fast.


A good gaming engine supports multiple graphics APIs, making specific ones irrelevant.


My point was, adoption in engines is relatively slow, but once it happens, things become easier. So current usage lists aren't a reflection of general progress. Unity for example gained Vulkan support in the latest version which came out recently, and Unreal is still not there (but already close). Same goes for Lumberyard.

Of course the development tax imposed by lock-in freaks translates into that slowness. I.e., as you said, the need for engines to support many balkanized APIs means slower releases to the market.


Isn't that what MS is saying about DX12? Except Vulkan is not locked into MS-only stuff.


The cost of porting between DX12 and Vulkan is not so high, not anything like the cost of porting between OpenGL and DX11.


In the abstract, performance may be unrelated to portability. But if tooling, drivers and expertise are all focused on DirectX, and if getting OpenGL to do everything DirectX can do requires using vendor-specific additions to the spec that are non-portable, then in practice, it may be harder to make OpenGL performant across multiple vendor's GPUs.


Seriously, anyone who cares about cross-platform compatibility is already using Unity or Unreal.


Not everyone, but probably quite many.


I'm going to agree with cwyers here. The choice of which graphics API to use is not automatic, and "just use OpenGL, dummy!" just glosses over the fact that different developers have different priorities.

OpenGL gets you desktop and mobile all in one fell swoop (well, with a lot of tinkering and hard work). If that's your market, sure, go ahead. If you're writing a higher-end game targeting consoles and PC, then you'll end up doing a lot of extra work porting your engine to consoles and in return have access to a much larger market. Those consoles don't really support OpenGL, at least not as a first-class citizen, but one of them does support Direct3D. Direct3D also has excellent developer support.

Given the good support, good tool integration, and the fact that Direct3D runs on, say, two of your three most important platforms, it's a good choice.


> Those consoles don't really support OpenGL, at least not as a first-class citizen

The Switch supports it now as a first-class citizen, and Vulkan as well. Nintendo was the first to straighten this out. MS and Sony are still in the dark ages.


That rhetoric about "the dark ages" is doing nobody any favors, it almost makes it sounds like Sony uses tabs in their source code or something even worse. Is this some kind of holy war, OpenGL versus the forces of darkness?

To clarify, yes, I was talking about PS4 and Xbox One. Those two systems, plus Windows, are the primary market for most AAA games. Just do a search for "2017 video games" and you'll see a big chunk of games that only run on those three systems. OpenGL isn't a very compelling choice there.


> That rhetoric about "the dark ages" is doing nobody any favors

I'm calling it what it is. Their refusal to support Vulkan shows they intend to remain in the dark ages and continue forcing developers to work with balkanized, non-portable APIs. It's as if some browser maker refused to support HTML and required you to use ActiveX, Flash, or whatever.


"I'm calling it what it is," is a poor excuse for poor rhetoric. Balkanized APIs are a historical fact. Khronos wasn't exactly quick to support modern hardware in the 2000s, everyone paid the price for it, Microsoft picked up the slack, and here we are with developers entrenched in the Direct3D ecosystem. Vulkan arrived about a year ago and that's not exactly a lot of time to sweep away all the inferior APIs before it. These things take time. The engine developers are figuring out how to use Vulkan, and Vulkan support is spreading. It's going to take more than a couple years.

Meanwhile, Microsoft is managing their profitable Windows business by supporting key app developers, including game studios, who are already happily using Direct3D. Are they going to axe it and piss off a valuable segment of developers? No.

The analogy with Flash is a pretty good one. Flash was used everywhere on the web for years and years. Then the iPhone sucked all the oxygen out of the room and Flash died.

Just like everyone switched from Flash to JavaScript so they could get their websites to run on the iPhone, maybe everyone will switch from Direct3D to Vulkan for the same reason. Flash took a long time to die. After Apple announced the iPhone would not support flash, it was five years later that YouTube stopped using it for video. And yes, there were a lot of good reasons to require Flash in the meantime, until the open web caught up.


> Balkanized APIs are a historical fact.

The dark ages were long and a historical fact too, which doesn't mean they didn't have a lot of problems :) So I find it a proper comparison. Those who insist on keeping things balkanized today (MS, Sony, and co.) are slowing down progress.

> maybe everyone will switch from Direct3D to Vulkan for the same reason.

I definitely hope so. Current messed up state shouldn't exist forever.


So a brand-new console from a company that has an abysmal record at working with third-party developers?

I kind of wish consoles would just go away, so we could focus on PCs.


Actually, as second-class citizens, to help with porting titles to it.

The Switch's first-class citizen is called NVN.

https://blogs.nvidia.com/blog/2016/10/20/nintendo-switch

Also, MonoGame, Unreal and Unity are officially supported middleware.


Also on Windows, Direct3D is better supported.

If you write an app based on Direct3D 9 or 11, it usually just works on any supported version of Windows.

I once wrote an app that uses OpenGL 4.0 (this one: https://github.com/Const-me/GL3Windows), and it only worked on half of my PCs. To make it work everywhere, I had to downgrade to OpenGL 3.0, and also implement S3 texture decompression on the CPU.
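(If anyone's curious what "S3 texture decompression on the CPU" boils down to, here's a rough sketch of decoding a single DXT1/BC1 block - written from memory, so double-check it against the spec before reusing it:)

    #include <stdint.h>
    #include <string.h>

    // Decode one 8-byte DXT1 block into 16 RGBA8 texels (4x4, row-major).
    static void decodeDxt1Block(const uint8_t* block, uint8_t out[16][4]) {
        uint16_t c0 = block[0] | (block[1] << 8);          // RGB565 endpoint 0
        uint16_t c1 = block[2] | (block[3] << 8);          // RGB565 endpoint 1
        uint32_t bits = block[4] | (block[5] << 8) |
                        (block[6] << 16) | ((uint32_t)block[7] << 24);

        uint8_t pal[4][4];
        for (int i = 0; i < 2; ++i) {                      // expand 565 -> 888
            uint16_t c = i ? c1 : c0;
            pal[i][0] = (uint8_t)(((c >> 11) & 31) * 255 / 31);
            pal[i][1] = (uint8_t)(((c >> 5)  & 63) * 255 / 63);
            pal[i][2] = (uint8_t)(( c        & 31) * 255 / 31);
            pal[i][3] = 255;
        }
        for (int ch = 0; ch < 3; ++ch) {
            if (c0 > c1) {   // 4-color mode: two interpolated colors
                pal[2][ch] = (uint8_t)((2 * pal[0][ch] + pal[1][ch]) / 3);
                pal[3][ch] = (uint8_t)((pal[0][ch] + 2 * pal[1][ch]) / 3);
            } else {         // 3-color mode: midpoint plus transparent black
                pal[2][ch] = (uint8_t)((pal[0][ch] + pal[1][ch]) / 2);
                pal[3][ch] = 0;
            }
        }
        pal[2][3] = 255;
        pal[3][3] = (uint8_t)((c0 > c1) ? 255 : 0);

        for (int i = 0; i < 16; ++i)                       // 2-bit index per texel
            memcpy(out[i], pal[(bits >> (2 * i)) & 3], 4);
    }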


GL 3.x is my current go-to for the case of needing simple immediate-mode 3D rendering. But it's still Bring Your Own Matrices/Texture Loading/Text Rendering/Model Loading, and D3D9 + D3DX had all that in a nice package. Like I said, stitching this stuff together isn't the hardest thing ever, but it's nice to have a basic set of boilerplate that's idiomatically consistent and works out of the box.
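To be concrete about the "Bring Your Own Matrices" bit, this is roughly the glm boilerplate that replaces what D3DX used to hand you for free (a sketch; aspect, model, and prog are assumed to exist, and "u_mvp" is just an example uniform name):

    #include <glm/glm.hpp>
    #include <glm/gtc/matrix_transform.hpp>
    #include <glm/gtc/type_ptr.hpp>

    // Build the usual model-view-projection and hand it to the shader.
    glm::mat4 proj = glm::perspective(glm::radians(45.0f), aspect, 0.1f, 100.0f);
    glm::mat4 view = glm::lookAt(glm::vec3(0, 2, 5),   // eye
                                 glm::vec3(0, 0, 0),   // target
                                 glm::vec3(0, 1, 0));  // up
    glm::mat4 mvp  = proj * view * model;
    glUniformMatrix4fv(glGetUniformLocation(prog, "u_mvp"),
                       1, GL_FALSE, glm::value_ptr(mvp));

Not hard, just one more thing to assemble yourself (and that's before text rendering and model loading).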


Because OpenGL is a dumpster fire of conflicting information for a beginner. I've been looking for a decent OpenGL book for years, in vain. DirectX 9 was sort of the sweet spot, where the technology stabilized for a few years at something decent. Nonetheless, aside from dumping the useful parts of D3DX, DirectX 11 is a lot friendlier to work with.


Doesn't a 3D graphics (not game) engine like OGRE or Irrlicht fill this void? I don't know if either one supports DX12/Vulkan yet, but I imagine they will in the not too distant future.


I'm thinking more like bgfx, which is a cross-platform API for accessing vertex and index buffers, textures, render targets, shaders, and render states. Primitives like models, sprites, fonts, etc. are up to you, but the repo contains examples with extra utilities for those things.


bgfx looks really cool, thanks for letting me know about it!


Those are higher-level scene graphs. D3D9 was an immediate-mode API with some helper libraries, so you were still in control of your app's rendering architecture.


The original Xbox also ran D3D (a D3D8-era variant). After getting my hands on the XDK for that box I had a lot of fun with it; I see your point regarding the sweet spot.


I get the value of garbage collectors, but honestly until Go I wasn't too fond of them, as I'd find myself spending about as much time thinking about memory management in Java as in C. At least with C you don't have the same lack of determinism as you do with traditional GC languages.

I find recent language trends interesting: moving away from VMs to native binaries, away from Java-style GCs, and embracing ARC and smart pointers. I have a strong bias toward minimalism, so I like the trend. If it serves to remind people that CPU time and memory, while incredibly cheap, aren't free, then all the better.


This is a really encouraging trend. For all the things I love about iOS (not to mention all the years supporting myself on iOS dev work), I love the fact that on desktop platforms, developers can just write a program, throw it up on the Internet, and have people use it or even pay for it without any middleman.

I don't have some deep moral objection to software walled gardens (if that fits your use requirements, more power to you!), but if that model ever wins out, it will be a sad day for innovation. Software's one of the most powerful tools in recent history for disruption, and having a centralized entity wield control over that creates a dangerous bottleneck for this relatively new, extremely powerful medium for human expression. Good to see that, on at least some level, the market agrees with me.


Yeah, collaborative drawing is kind of old hat, but when you can use the context of the modern, social web to provide some new modes of interaction around it, it can be interesting again. Same applies to more mundane things like text.


Not quite the same, but my startup, Formgraph[0], also does public, real-time collaborative drawing. I even did a similar write-up about the stack behind it the other day[1]. Looks like they're relying on Cassandra and Redis. I went with RethinkDB. Should probably do a write-up about the front end in the future.

[0] https://www.formgraph.com

[1] https://medium.com/@JohnWatson/real-time-collaboration-with-...

