Ready-to-use IDE based on containers and VSIX #45
Labels: discussion, documentation, enhancement
/cc @carlosedp @mithro @cmarqu @mgielda @PiotrZierhoffer @kgugala @qarlosalberto @smgl9 @Nic30 @pepijndevos @drom @cavearr @juanmard @olofk @gojimmypi @ghuntley @mickaelistria @trabucayre @Paebbels
A tweet from @carlosedp (https://twitter.com/carlosedp/status/1432708143067377665) triggered some interesting dialogues about IDEs and supporting easy integration with EDA tool(chains).
This is a compendium of ideas as a result of those dialogues, in the hope that we can find synergies and slowly bring isolated projects together.
Introduction to the Electronic Design Automation (EDA) industry
Some participants are familiar with containers and IDEs, but not with the specifics of EDA tooling. Let this be a rough introduction:
Hardware is inherently parallel. Unlike software (read von Neumann paradigm), there is no central unit orchestrating/serialising the execution. Some authors use the terms "computing in time" and "computing in space", to make it explicit that hardware designs have a physical 2D/3D layout, below the "program/application logic". As a result, there is an unavoidable friction between text based design workflows and GUI/block/diagram based workflows.
Hardware was designed almost exclusively by drawings until the late 80s to mid 90s. There was almost no distinction between ASICs (chips) and PCBs (boards) with regard to user input. Then, so-called Hardware Description Languages (HDLs) stepped in. HDLs are in fact software programming languages, with certain patterns that specific compilers (named synthesis tools) can use for inferring (say, guessing) the corresponding hardware blocks. In parallel to HDLs, devices named FPGAs were invented. FPGAs sit between an Application Specific Integrated Circuit (ASIC) and a CPU (software programmable). FPGAs are configurable (programmable at the hardware level) and also programmable at the software level.
NOTE: Verilog is based on C. VHDL is based on Ada. The fact that both VHDL and (System) Verilog can be compiled to executable binaries through LLVM/GCC as if they were C/Ada has further implications, but let's keep those aside.
At first, FPGAs were used for prototyping ASICs only. However, nowadays FPGAs are found in products, as much as CPUs or ASICs. So, HDLs are "the programming languages of FPGAs" and there is a whole industry of "module" designers in the "IP core" subset of the EDA market. As a result, modern workflows for designing chips are becoming closer to mainstream software development practices, almost overlapping. However, the EDA industry is ~20 years behind the software industry, and the hardware/software developer ratio is somewhere around 1/10. There is so much to learn, so much work to do, but few human resources (in comparison to software projects). This GitHub organisation and the multiple projects mentioned below are distributed efforts to bring the "hardware designer" experience closer to "software development".
There are four main challenges for building an open source EDA IDE:
Lack of maturity of most projects. Open source started stepping into hardware in the early-to-mid 2000s, it really started to kick off in the early 2010s, and it has been rocketing to the stars in the last 5 years. So many monoliths were built in the last decade. Now it's time to polish them, take the best, package them, and reuse them in integrated solutions. That's more of a communication / community building challenge than a technical one. The knowledge is there; we need to be able to put it together.
Dependency on GUI tools. HDL designers need waveform visualisation for debugging. ASIC designers need 2D/3D layout viewers. PCB design is almost exclusively GUI based. Unlike software development, visualising graphs/diagrams is necessary for hardware development, not optional.
Size of the tools. While most simulation / early development / verification tools are very lightweight, implementation is astonishingly heavy. Vendor IDEs require 10-50GB. Part of that is useless stuff, but a significant portion corresponds to the databases of the FPGA devices or the PDK in case of ASICs.
Connection to the boards. Similarly to writing firmware for embedded microcontrollers, testing FPGA designs requires programming the boards at some point.
Similarly to "IP cores" at the FPGA/HDL level, reusable modules exist at the ASIC level. Those are named "chiplets". In fact, Google is pushing hard towards an open source hands-free HDL to chiplet solution. Tim @mithro is the visible lead of those efforts, under the umbrella of SymbiFlow, and in tight collaboration with Antmicro, efabless, SkyWater, etc. Some of the repositories in this organisation are precisely maintained by people from either Google or Antmicro (see bazel and conda repos).
Packaging and distribution
In the context of this issue, we will set the scope to all-in-one containers, such as the ones maintained in this repository.
That is, a single container image containing all the tools/dependencies expected to be used by an IDE.
That is not desirable for CI purposes (that's why fine-grained images are provided as well); however, it's the most reasonable option for newcomers, students, etc.
We don't care about how the container images are built, as long as the EDA tools are available in the PATH. It's ok to use the all-in-one containers from this repo, or any other. In other words, we will not address the lack of maturity with regard to how tools are built and installed. That's to be discussed in hdl/packages.
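As a minimal illustration of that contract, a smoke test could be as simple as the following (the image name is just a placeholder for whichever all-in-one image is used; see the README/UG for the actual tags):

```bash
# The only assumption is that the EDA tools are available on the PATH.
docker run --rm ghcr.io/hdl/debian/bullseye/ghdl ghdl --version
```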
Local execution
x11docker/runx
When containers are executed locally, either through docker or podman, x11docker and runx can be used for starting an X server and sharing it with the container.
That allows running GUI tools without installing any additional software in the container images.
That is showcased in the UG of this repo: https://hdl.github.io/containers/ug/index.html#_tools_with_gui.
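For reference, a typical invocation would look as follows (the image name is illustrative; any image providing the tool on its PATH works):

```bash
# Start a disposable X server and run GTKWave from a container through it;
# --share makes the current directory (and the VCD file in it) visible inside.
x11docker --clipboard --share "$PWD" ghcr.io/hdl/debian/bullseye/gtkwave gtkwave "$PWD/wave.vcd"
```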
x11docker works on GNU/Linux hosts, and also on Windows (through runx).
GTK3 Broadway
GTK3 has a built-in HTML "frontend" named Broadway (see https://developer.gnome.org/gtk3/stable/gtk-broadway.html). Hence, tools which use the GTK3 toolkit can be accessed through a web browser. That does not require any modification to the container images, apart from starting the broadway server (see https://github.com/ghdl/docker/blob/master/broadway.sh).
See, for instance, GTKWave in image ghdl/ext:broadway. IIRC, the future of Broadway in GTK4 is uncertain.
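For context, this is roughly what happens inside such an image (the display number and port follow the Broadway defaults; GTKWave availability is an assumption):

```bash
# Start the Broadway display server; display :5 is served over HTTP on port 8085.
broadwayd :5 &
# Point any GTK3 tool at it; the GUI then shows up in the browser.
GDK_BACKEND=broadway BROADWAY_DISPLAY=:5 gtkwave wave.vcd
```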
VSCode Docker extension
Microsoft's Docker extension for VSCode provides some interesting features for starting containers (optionally using x11docker): microsoft/vscode-docker#1496. It's been a while since I last had time to tinker with that. Nevertheless, I've always had very good communication and feedback from the maintainers of the extension. I was pleasantly surprised by how responsive and proactive they were.
VSCode and Remote - Containers or Remote - WSL
I don't find Remote - Containers and Remote - WSL appealing, mostly because, AFAIAA, they contain non open source pieces. On the one hand, the Hyper-V version provides easier resource management. On the other hand, by configuring tasks of the Docker extension, I don't feel the need to start VSCode instances inside the container.
Nevertheless, I do sometimes use the Remote - Containers feature on Windows. Since Microsoft refuses to support MSYS2 in VSCode, all git features are broken there. Running VSCode attached to a container fixes that, because it runs in a GNU/Linux environment, where git works. However, credentials need to be shared, which is cumbersome.
Overall, Remote - Containers and Remote - WSL are the most similar local experience to the solution provided by Gitpod or Eclipse Che.
USB boards
As explained in https://hdl.github.io/containers/ug/index.html#_usbip_protocol_support_for_docker_desktop, programming boards through USB from containers is not straightforward. On GNU/Linux, the devices can be shared; however, on Windows containers run on a Virtual Machine which does not support access to arbitrary USB devices. USB/IP allows working around that limitation by using a TCP/IP port. Yet, we are lacking an open source implementation of USB/IP on Windows.
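On GNU/Linux, the sharing boils down to something like the following (the image name is illustrative, and passing the whole /dev/bus/usb tree is just one common approach; individual devices can be passed instead):

```bash
# Expose the USB devices to the container and let openFPGALoader find the board.
docker run --rm -it --device /dev/bus/usb ghcr.io/hdl/debian/bullseye/prog openFPGALoader --detect
```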
Extensions
There are multiple extensions for VSCode related to HDL|EDA already: https://github.com/dbhi/vsc-hdl#references.
@ghuntley mentioned an open source VSIX extension marketplace gifted to the Eclipse Foundation by Gitpod/TypeFox (https://twitter.com/GeoffreyHuntley/status/1433035049524346887): https://open-vsx.org/. It would be interesting to have HDL|EDA related extensions published there.
Probably the most complete open source extension is TerosHDL. TerosHDL wraps some of the most used VHDL and Verilog projects, such as cocotb, VUnit, GHDL, wavedrom, VHDL formatter, Yosys... It is sensible to join forces around TerosHDL. As a matter of fact, a previous version of TerosHDL was based on Atom, instead of VSCode (see https://www.youtube.com/watch?v=_wxTjOSO5oY). apio-ide was a similar approach, but it unfortunately became unmaintained before reaching maturity.
Sigasi does have several features similar to TerosHDL and it is now provided as a VSCode extension as well. However, it's not open source.
Diagrams
GHDL and/or Yosys allow generating diagrams of synthesisable HDL sources (see https://umarcor.github.io/osvb/apis/project/DocGen.html#diagrams). In fact, TerosHDL uses some of those features already.
It also allows generating dependency tree diagrams.
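As a sketch, one such flow for VHDL sources uses the ghdl-yosys-plugin together with netlistsvg (both assumed to be installed; file and unit names are illustrative):

```bash
# Elaborate the design with GHDL inside Yosys, dump a JSON netlist and render it as SVG.
yosys -m ghdl -p 'ghdl top.vhd -e top; prep -top top; write_json top.json'
netlistsvg top.json -o top.svg
```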
Sigasi provides a similar solution in the XPRT tier: https://insights.sigasi.com/manual/views/#block-diagram-view. Yet, that is not open source.
On the other hand, there is no mature project for web technology based block-diagram input yet, but there are some interesting proofs of concept:
It would be interesting to evaluate whether any of those can be used as a VSCode extension.
There are some other nice projects not based on web technologies: https://umarcor.github.io/hwstudio/doc/#_references.
I'm not experienced with the analog and ASIC side of "drawings". I know @pepijndevos is working on NyanCAD, which uses web technologies as a frontend (see NyanCAD/Mosaic). I believe that Mosaic might be provided as a VSIX extension.
Waveforms
The most used web technology based waveform viewer is probably wavedrom (see also zoom and vcdrom). In fact, there is a VSCode extension: https://marketplace.visualstudio.com/items?itemName=bmpenuelas.waveform-render. See also vscode-vcdrom and vcd-samples.
However, there are other solutions:
impulse.vscode is particularly interesting, but it's not open source.
A common limitation of most waveform visualisation projects based on web technologies is that waveform formats other than VCD are typically not supported. As explained in https://umarcor.github.io/osvb/apis/logging.html#waveforms, VCD cannot handle certain types from the VHDL language.
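For instance, with GHDL the dump format is chosen at run time; GHW preserves VHDL composite/enumerated types, while VCD is what most web based viewers expect (the unit name top is illustrative):

```bash
# Dump VCD, which web viewers such as wavedrom/vcdrom can read...
ghdl -r top --vcd=wave.vcd
# ...or GHW, which handles VHDL-specific types but needs GTKWave for viewing.
ghdl -r top --wave=wave.ghw
```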
Gitpod
Gitpod can be particularly useful in learning/teaching environments, as it allows setting up workspaces which are ready to use by inexperienced users (see https://www.gitpod.io/blog/workshops-as-code). It is mostly based on creating a container, having a repository cloned in it and attaching a web editor using VSCode or Theia.
Since everything is executed remotely, using GUI tools is apparently limited. The most straightforward solution seems to be gitpod/workspace-full-vnc, which is based on gitpod/workspace-full, which is based on gitpod/workspace-base... That is used in cocotb and umarcor/SIEAV.
On the one hand, @ghuntley suggested not to use the gitpod/workspace* images, but to build our own (see https://twitter.com/GeoffreyHuntley/status/1433083202705387525). That is sensible indeed. However, since there is so much content in the workspace-base, workspace-full and workspace-full-vnc images/layers, it's not straightforward to guess which are actual dependencies and which are optional/convenience features. I'm thinking about Apache2, NGINX, PHP, Homebrew, Go, etc. In Custom Docker Image, the steps to customise an image are explained, but there is no explicit reference to dependencies. Are there no dependencies at all? That is, is Gitpod expected to work on a debian:bullseye-slim (as the VSCode attach feature does)?
On the other hand, in order to use GUI tools, alternatives to VNC might be evaluated. Precisely, x11docker provides several solutions, either local or remote. Among those, xpra might be the most interesting one to test in Gitpod. By running the xpra client on the host, users can have "native" windows, while still using the in-browser and remote VSCode/Theia frontend of Gitpod.
It would be interesting to test GTK's broadway as well. However, that's more of a fancy test, rather than usable for most HDL/EDA tools.
Overall, in the context of this repository, it would be interesting to have a debian-bullseye/gitpod.dockerfile, which we could use on top of any of the existing images in collection debian/bullseye. That should be relatively straightforward, once we clarify the minimal requirements and the best approach for a good user experience when using GTKWave, KLayout, etc.
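A rough sketch of what such a file might look like; the base image tag and the package list are assumptions to be validated, and the gitpod user with UID 33333 is what Gitpod's documentation asks custom images to provide:

```dockerfile
# Hypothetical debian-bullseye/gitpod.dockerfile
ARG IMAGE=ghcr.io/hdl/debian/bullseye/ghdl/yosys
FROM $IMAGE

RUN apt-get update -qq \
 && DEBIAN_FRONTEND=noninteractive apt-get -y install --no-install-recommends \
    ca-certificates git sudo \
 && apt-get clean && rm -rf /var/lib/apt/lists/*

# Gitpod expects an unprivileged 'gitpod' user with UID 33333 in custom images.
RUN useradd -l -u 33333 -G sudo -md /home/gitpod -s /bin/bash gitpod
USER gitpod
```

Downstream repositories would then point at it through the image/file field of their .gitpod.yml.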
Eclipse Che
When I tried Che some years ago, I think it was still independent from Theia. Compared to Gitpod and in analogy to docker, Che felt like creating stacks of containers through compose, while Gitpod is for running a single container. It felt to me that the strength of Che was user management, groups, permissions, access to shared projects, etc. In fact, it is now advertised as "The Kubernetes-Native IDE for Developer Teams". portainer.io provides some overlapping features in that regard. As a result, a Che instance is not lightweight and it requires some good knowledge in order to deploy and (mostly) maintain. While I see the value within organisations/companies, it was/is out of my scope because I don't have the infrastructure for running it as a service.
On the other hand, the architecture of Che feels like a better fit for using fine-grained containers. Instead of having all the tools/dependencies in a single image, Che allows/expects the container running the editor to be different from the ones doing the heavy work. That fits nicely with the conception of a toolkit/API that can execute commands either locally or in sibling containers. Therefore, in the mid term it might make sense to find synergies between Che and the distributed runner aspect of https://umarcor.github.io/osvb/apis/tool.html.
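For illustration, the sibling-container pattern essentially means sharing the host's Docker socket with the editor container, so that tool containers are started next to it rather than inside it (image names are placeholders):

```bash
# The editor container only needs a Docker CLI plus the host's socket; the EDA
# work runs in a sibling container. "$PWD" is expanded on the host, which is
# exactly the path the sibling needs for its bind mount.
docker run --rm -it \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v "$PWD":/src -w /src \
  my-editor-image \
  docker run --rm -v "$PWD":/src -w /src my-eda-image ghdl --version
```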
OpenAPI
There is work-in-progress in edaa-org for defining a set of abstraction models, which might then be wrapped in an OpenAPI (see https://umarcor.github.io/hwstudio/doc/#_structure). That would allow reusing the tool management backends regardless of the frontend. As a proof of concept, https://github.com/umarcor/hwstudio showcases how to host a web frontend in GitHub Pages, and use it with a Python backend executed locally.
In fact, having an API that allows a remote frontend to communicate with a local backend is a requirement in order to program the boards. Gitpod and/or Che can be used for simulation, synthesis, implementation, etc., but they cannot program the bitstreams into the boards by themselves, because the boards are connected to the user's computer, not to the remote service. There are three possible solutions:
Note that some vendors (such as Xilinx) provide a "Hardware manager" (named Vivado Lab Solutions) as a "compact, and standalone" package that users can install on their workstations with multiple FPGA boards to use them remotely. Overall, we should head towards providing an equivalent solution for open source boards/workflows, probably based on openFPGALoader.
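Whichever approach is chosen, the step that must run on the user's machine is the actual programming, e.g. (board name and bitstream path are illustrative):

```bash
# Program a bitstream built remotely (in Gitpod/Che) into a locally attached board.
openFPGALoader -b arty build/top.bit
```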
As soon as we can support programming boards from Gitpod/Che, we should provide those options in fomu-workshop.