We do WebGL rendering of big graphs as part of Graphistry, and before 20M nodes break the system, the ~10X more edges will ;+)
But more seriously, a careful WebGL implementation gets you maybe halfway there, and most I've seen break well before then. It gets tricky with browser JS memory limits and raw perf. We are looking into hitting the 100M level with interactive speeds... but that takes creativity & optimizations I wouldn't expect from a diagramming tool :) and, if that's your kind of thing, we are hiring :)
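To make the memory pressure concrete, here's a rough back-of-the-envelope sketch (plain JS, illustrative numbers only, assuming interleaved float32 x/y positions per vertex and each edge drawn as a 2-vertex line segment, which is one common but not the only buffer layout):

```javascript
// Rough GPU/JS buffer math for a point-and-line graph renderer.
// Assumes interleaved float32 (x, y) per vertex -- a hypothetical
// layout, not Graphistry's actual one.
const FLOATS_PER_VERTEX = 2; // x, y
const BYTES_PER_FLOAT = 4;

function bufferBytes(vertexCount) {
  return vertexCount * FLOATS_PER_VERTEX * BYTES_PER_FLOAT;
}

const nodes = 20e6;        // ~20M nodes
const edges = nodes * 10;  // ~10X more edges

const nodeBytes = bufferBytes(nodes);
const edgeBytes = bufferBytes(edges * 2); // 2 endpoints per edge

console.log((nodeBytes / 2 ** 30).toFixed(2) + ' GiB for node positions');
console.log((edgeBytes / 2 ** 30).toFixed(2) + ' GiB for edge endpoints');
```

Node positions alone are manageable (~0.15 GiB), but naive edge buffers land around 3 GiB before you've added colors, sizes, or picking IDs, which is why edges, not nodes, tend to break things first.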
And there's a link to a gallery with even bigger ones, so the number of nodes alone does not seem to be the main limiting metric; it probably matters in combination with the choice of rendering and layout.