Been a few years since I put NetworkX through its paces, but the several times I have tried it, I found remarkably weak support for graph layout and display. NetworkX's analytic routines may be strong, but displaying graph-structured problems is far more interactive and attractive via d3.js, GraphViz, etc. At least for my problems, communicating graph structures, and having nodes and edges that represent different kinds of things, are basic requirements, not optional frills.
> Been a few years since I put NetworkX through its paces, but the several times I have tried it, found remarkably weak support for graph layout and display.
Well, yeah, the project is pretty open about being the wrong tool for that job. Here's what the NetworkX documentation [0] says about its visualization support:
> NetworkX provides basic functionality for visualizing graphs, but its main goal is to enable graph analysis rather than perform graph visualization. In the future, graph visualization functionality may be removed from NetworkX or only available as an add-on package.

> Proper graph visualization is hard, and we highly recommend that people visualize their graphs with tools dedicated to that task. Notable examples of dedicated and fully-featured graph visualization tools are Cytoscape, Gephi, Graphviz and, for LaTeX typesetting, PGF/TikZ. To use these and other such tools, you should export your NetworkX graph into a format that can be read by those tools. For example, Cytoscape can read the GraphML format, and so, networkx.write_graphml(G, path) might be an appropriate choice.
Laying out a graph so it's "friendly to humans" is a seriously hard problem. I've built complex DAG workflow engines using networkx and its layout tools and they worked just fine. But, yeah, I guess it depends on what you need?
Export the graph to GML, GraphML, GraphViz DOT, or some other graph format, and feed it to a dedicated utility. BTW I recommend 3D graph visualization over 2D when possible, that is, when you're exploring interactively as opposed to printing figures. The Graphia tool is the only FOSS tool for this purpose that I know of:
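For anyone who hasn't done the export step before, a minimal sketch (the graph and file names here are made up; `write_graphml` and `write_gexf` are the standard networkx writers):

```python
import networkx as nx

# Small example graph with attributes on nodes and edges.
G = nx.Graph()
G.add_node("a", kind="server")
G.add_node("b", kind="client")
G.add_edge("a", "b", weight=3)

# Export to formats that dedicated visualization tools can read:
nx.write_graphml(G, "graph.graphml")  # e.g. for Cytoscape
nx.write_gexf(G, "graph.gexf")        # e.g. for Gephi
```

From there the file can be opened directly in the visualization tool of your choice.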
You may like my Netgraph library [1], which is a Python library that aims to complement networkx, igraph, and graph-tool with publication-quality visualisations.
Netgraph implements numerous node layout algorithms and several edge routing routines. Uniquely among Python alternatives, it handles networks with multiple components gracefully (which otherwise break most node layout routines), and it post-processes the output of the node layout and edge routing algorithms with several heuristics to increase the interpretability of the visualisation (reduction of overlaps between nodes, edges, and labels; edge crossing minimisation and edge unbundling where applicable).

The highly customisable plots are created using Matplotlib, and the resulting Matplotlib objects are exposed in an easily queryable format such that they can be further manipulated and/or animated using standard Matplotlib syntax. Finally, Netgraph also supports interactive changes: with the InteractiveGraph class, nodes and edges can be positioned using the mouse, and the EditableGraph class additionally supports insertion and deletion of nodes and edges, as well as their (re-)labelling, through standard text entry.
If I want to visualize a graph, particularly a large one, I just dump it out to GEXF [0] format and load it into Gephi [1]. It kicks back some legacy formatting errors, but they don't really impact the graph. Gephi also supports temporal graph analysis, which is nice.
I definitely agree. There's no reason, in my opinion, not to have an extension library that does good visualization via a force-directed layout or similar. The existing visualization methods are pretty barebones.
I found the documentation for networkx much better than igraph's [1] (at least for the Python version). However, for community detection algorithms graph-tool [2] is better (it also uses a different class of models than the standard ones in the literature).
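For comparison, networkx does ship a basic modularity-based community detection routine. A minimal sketch using the bundled karate-club graph (greedy modularity maximization is just one of several methods, and not the SBM-style inference graph-tool offers):

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Zachary's karate club: the classic community-detection test graph
# that ships with networkx.
G = nx.karate_club_graph()

# Returns a list of node sets, ordered from largest to smallest community.
communities = greedy_modularity_communities(G)
```

Each element of `communities` is a set of node ids, and together they partition the graph's nodes.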
NetworkX let me whip up a useful shortest-path routing proof of concept from telco data in a few hours. I was impressed with myself, but all glory goes to NetworkX!
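That kind of proof of concept really is only a few lines. A sketch with a made-up toy topology (the node names and weights here are invented, not from the commenter's telco data):

```python
import networkx as nx

# Toy topology: nodes are exchanges, edge weights are link costs.
G = nx.Graph()
G.add_weighted_edges_from([
    ("A", "B", 4), ("A", "C", 2),
    ("C", "B", 1), ("B", "D", 5),
    ("C", "D", 8),
])

# With a weight key, networkx uses Dijkstra's algorithm under the hood.
path = nx.shortest_path(G, "A", "D", weight="weight")
cost = nx.shortest_path_length(G, "A", "D", weight="weight")
```

Here the cheapest route A→D goes via C and B (cost 2 + 1 + 5 = 8), not the direct-looking A→B→D (cost 9).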
Interesting you should say that, as I am trying to start a project where I need to make an electric grid graph, but I am not sure where to find the node/edge data for substations and transmission lines that include their specs and capacities. Is that stuff open source somewhere, like with the ISOs, or do you need to build it from scratch?
That data is privately held by utilities, and the high voltage transmission infrastructure is highly confidential (CEII/NERC CIP). If you just need sample data, I'd recommend checking out the test data provided with power flow simulators like OpenDSS for distribution systems [1] or MATPOWER for generation + transmission [2]. The IEEE test systems are what are used in research, they have the component specs you're looking for, and are provided with those tools.
I have tried to talk to engineers about contingency analyses for what would happen if a unit went down, and they tend to have very wishy-washy answers. Or giant, tediously compiled reports that can model exactly one change.
Good question. Are you talking about generation, transmission or distribution?
From my experience as an electrical engineer working for a distribution network:
* The traditional approach to network planning: take your edge cases (e.g. winter peak demand), and apply your engineering knowledge and intuition to manually study the most onerous outage conditions.
* This will vary on where you are in the world, but networks tend to have a good amount of slack built in.
* As networks become more complex, and the cost of computing has fallen, it's more feasible to automate contingency analysis (think about the number of different outage combinations for an N-2 scenario).
FWIW, the internal tools that I work on make use of networkx to determine contingency cases.
~20 years ago, the regulator introduced an incentive scheme to reduce customer interruptions and minutes lost. This resulted in heavy investment in network automation in the UK.
A single unit going down would correspond to an N-1 case when doing a transmission system study. There are ways of automating steady-state analysis for this case to do a full sweep across the nearby system (either looking at k hops away, all parts in a zone (where a zone has a specific meaning in this context), or using a utility-provided set of assets for the analysis). This pretty much consists of running a load flow for each individual case and compiling the results while making sure they are valid (convergence, device behavior, etc.).
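The sweep structure described above maps neatly onto networkx. A heavily simplified sketch: the bus and generator names are hypothetical, and a connectivity check stands in for the per-case load flow (a real study would run a power flow solver for each outage, not just check for islanding):

```python
import networkx as nx

# Hypothetical transmission topology; node names are invented.
G = nx.Graph()
G.add_edges_from([
    ("gen1", "busA"), ("busA", "busB"),
    ("busB", "load1"), ("busA", "load1"),
])

def n_minus_1_sweep(graph):
    """Take each branch out of service in turn and flag cases that
    island part of the system. Stand-in for a real load-flow run."""
    violations = []
    for u, v in list(graph.edges):
        graph.remove_edge(u, v)
        if not nx.is_connected(graph):
            violations.append((u, v))
        graph.add_edge(u, v)  # restore before the next case
    return violations
```

In this toy system, only losing the generator's single tie (`gen1`–`busA`) islands anything; every other branch has a parallel path.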
This is only the steady-state analysis; there's also dynamic analysis when looking at specific generators, examining a generator's response to fluctuations in voltage and frequency to ensure stability within certain operating conditions (weakening of the grid, rapid changes in voltage or frequency).
If they were wishy-washy, they were probably limited to doing distribution, where you assume a single strong source (swing bus) at the substation and it's not your responsibility to think too much about adjusting system behavior based on changes in transmission (usually).
I mean transmission. And by wishy-washy I mean they have the reports, but it's not compiled into any sort of useful system, so they are not able to quickly answer questions about it.
Most seem to outsource this analysis. I’m curious if you have a sense for how common it is for a transmission utility to really own this kind of analytics?
Yes. It’s generally stable in most localities. “Instabilities” in this system are brownouts, power failures, and other events.
There are stabilising features within most electricity grids, but they can only cope with so much. In general, forward planning is done so that the amount of dynamic adjustment needed stays within an allowable range.
But to be honest I don’t know how modern grids have adapted with many more micro generators than in the old days.
See also https://github.com/Qiskit/rustworkx – a general purpose graph library for Python written in Rust to take advantage of the performance and safety that Rust provides.
> Rustworkx was originally called retworkx and was created initially to be a replacement for qiskit's previous (and current) NetworkX usage (hence the original name). The project was originally started to build a faster directed graph to use as the underlying data structure for the DAG at the center of qiskit-terra's transpiler. However, since its initial introduction the project has grown substantially and now covers all applications that need to work with graphs, which includes Qiskit.
I recently used the networkx algorithm for finding Hamiltonian cycles in a graph to generate a Secret Santa assignment with constraints (couples don't send gifts to each other, and people don't give to the same person as last year). It works great even though the problem is NP-complete, since my number of participants is very low.
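For a handful of participants, even a brute-force search over the allowed-gifting graph is instant. A sketch of the idea (the names and constraints here are invented, and this is a naive search, not necessarily the exact routine the commenter used):

```python
import networkx as nx
from itertools import permutations

# Hypothetical participants; a directed edge means "may give a gift to".
people = ["ann", "bob", "cat", "dan"]
G = nx.complete_graph(people).to_directed()

# Constraints: couples don't gift each other, no repeats from last year.
for u, v in [("ann", "bob"), ("bob", "ann"),  # ann and bob are a couple
             ("cat", "dan")]:                  # cat gave to dan last year
    G.remove_edge(u, v)

def hamiltonian_cycle(graph):
    """Brute force: fine for a handful of nodes, hopeless beyond ~10."""
    nodes = list(graph.nodes)
    first, rest = nodes[0], nodes[1:]
    for perm in permutations(rest):
        cycle = [first, *perm, first]
        if all(graph.has_edge(a, b) for a, b in zip(cycle, cycle[1:])):
            return cycle
    return None
```

Each consecutive pair in the returned cycle is a giver→receiver assignment that respects the constraints.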
I've tried the same in Rust with petgraph, which resembles networkx, but it doesn't have a built-in algorithm for Hamiltonian cycles, and I couldn't wrap my head around the DFS/BFS visitor pattern. I'll continue this some day.
They are distinct problems: if you have a graph with a Hamiltonian cycle in it, you can add as many edges as you want and the cycle will always be there, but some N-colouring solutions might break.
I've used igraph. While it's much faster, for me at least, modifying the graph once it's constructed is harder compared to networkx. Haven't worked with cugraph, though. As always, use the right tool for the job.
> pytype (Google) [1], PyAnnotate (Dropbox) [2], and MonkeyType (Instagram) [3] all do dynamic / runtime PEP-484 type annotation type inference [4] to generate type annotations.
Hypothesis generates tests from type annotations; icontract and pycontracts do runtime type checking.
Back in the day, I discovered NetworkX and Gephi in a Coursera course and was really surprised by how simply it managed to visually represent such a hard problem. (I've never been able to find that course again; it started with Erdős numbers, that's the only thing I remember.)
> On-Topic: Anything that good hackers would find interesting. That includes more than hacking and startups. If you had to reduce it to a sentence, the answer might be: anything that gratifies one's intellectual curiosity.
Graph theory underpins nearly everything we do in software development and computer science. Networkx is an expansive -- though not the only -- package for Python that'll solve 90% of people's problems.
Yeah, I will admit I was conflicted about posting this, but I've seen a few instances of posts like this, and it was very useful for a recent project. I was really fishing for alternatives, though :)
So much "news" these days is just unadulterated crass clickbait, that a friendly reminder to revisit interesting subjects really does qualify as above average "news", yes. Sadly?
[1] "What are the best libraries to work with graphs?" https://www.reddit.com/r/Python/comments/185xexg/what_are_th...