Forum Discussion
Altera_Forum
Honored Contributor
15 years ago

I wonder if your question was answered. It seems HDL has the momentum. A big reason is that it is a standard across the industry. Also, the tools are written by people who are not concerned with path delay and clock domains. Even when we used schematic entry, placement and routing used a sort of HDL to map from logic block to physical block. Our logic diagrams required placement and routing for printing as well as placement and routing for wiring. Text input is the natural fit for program data entry, and the netlist view does show a schematic. HDL does work, just not very efficiently.
I have a simulator prototype attached, along with a short description of the input syntax and a test case that initializes a couple of registers and moves some bit fields from one to the other. The control logic input is Boolean rather than if/else. The register assignments controlled by each clock (domain) follow the clock definition in the text input. The .exe is in the .zip, and the syntax description and test-case example are in the .txt.

Enter the electronic design world. I designed (mostly digital) circuits in the eighties using schematic entry. Returning to digital design these days, I find that schematic entry is frowned upon and that using VHDL/Verilog is deemed "more professional". There are some religious issues; converts would say "more professional". Can someone explain to me why schematic entry (which, from a pure programming standpoint, could be considered a 4th-generation entry tool) is considered inferior to VHDL/Verilog (which, from a pure programming standpoint, could be considered 3rd-generation entry tools)?

I've looked at VHDL/Verilog, and even though they work for me, I find they make it harder to envision (gate) delay characteristics and clock domains than schematic entry with building blocks would. Schematic entry supports a fairly straightforward left-to-right, top-to-bottom processing workflow; VHDL/Verilog tend to result in spaghetti-like code flows (because of the parallel nature of the hardware being described), which makes design errors more likely...

That is because the hardware is parallel, and the objective of HDL is to use sequential ifs to define parallel AND gates. In a schematic you can follow a clock line and see everything it triggers. In HDL you have to find all the always blocks where the clock appears in the text, and there has to be an always block for practically every register.
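A minimal Verilog sketch of both complaints (module and signal names are invented for illustration): the same 4-input AND written as a gate-like continuous assignment versus nested ifs, and one always block per register, so that tracing a clock means searching the text for every block that names it.

```verilog
// Hypothetical module; names chosen only to illustrate the discussion.
module and_styles (
    input  wire a, b, c, d,
    input  wire clk_a, clk_b,
    input  wire next_a, next_b,
    output wire y,
    output reg  y2,
    output reg  reg_a, reg_b
);
    // Gate-level view: reads like the schematic symbol.
    assign y = a & b & c & d;

    // Same function as sequential ifs; synthesis infers one AND gate,
    // but the reader has to count the ifs instead of seeing a symbol.
    always @* begin
        y2 = 1'b0;
        if (a)
          if (b)
            if (c)
              if (d)
                y2 = 1'b1;
    end

    // One always block per register: to see everything clk_a triggers,
    // you must find every block that lists "posedge clk_a".
    always @(posedge clk_a) reg_a <= next_a;
    always @(posedge clk_b) reg_b <= next_b;
endmodule
```

In a schematic, both registers and the AND gate would hang visibly off the clock and signal lines; in the text, that fan-out is recovered only by searching.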
Why would anyone want to look at a block symbol, see 4 inputs, and immediately realize that 4 signals are being ANDed? It is more of a challenge to count the if statements. Unless someone is able to envision the schematic equivalent in his head while cranking out VHDL, but then using VHDL is more of a nuisance than an advantage.

Remember also that synthesis is in there, anxious to throw away any logic it cannot see driving an output pin. And it may or may not generate the logic that you wanted. Placement and routing also waste compile time by running while the design is not logically complete. I have been called an idiot for complaining about synthesis throwing away logic that did not drive an output pin, because supposedly all I was doing was wasting precious power. So I cannot put in incomplete logic to get a resource count for sizing purposes.

VHDL has its place for complicated table-driven logic which isn't easily described in functional blocks, but other than that, it makes it more difficult to envision the hardware equivalent than schematic entry does, and thus hinders productivity. Could anyone more proficient than I am in VHDL/Verilog comment on my assertions above?

--- Quote End ---
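The pruning complaint has a common workaround: mark the incomplete logic with vendor "keep" attributes so it survives synthesis long enough to read a resource count. A hedged Verilog sketch (module and signal names invented; the attribute spellings shown are the ones Altera's Quartus documentation uses, and other tools spell them differently):

```verilog
// Hypothetical fragment: a counter with no path to any output pin.
// Without an attribute, most synthesis tools silently delete it,
// making it useless as a quick sizing probe.
module sizing_probe (
    input wire clk,
    input wire en
);
    // Quartus-style attributes: "noprune" keeps a register that
    // drives no output; "keep"/"preserve" stop nets and registers
    // from being merged or optimized away.
    reg [15:0] count_unused /* synthesis noprune */;

    always @(posedge clk)
        if (en)
            count_unused <= count_unused + 16'd1;
endmodule
```

With the attribute in place, the fitter report shows the registers and logic the fragment would consume, even though nothing is connected downstream yet.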