D seems to have gained heavy interest, judging from the
amount of traffic in digitalmars' newsgroups, and I take
that as a sure sign that many people are very dissatisfied
with C++ and Java as systems languages (and in the case
of Java, a big 'WTF?!?' ought to be placed right after it).
I am still very much convinced that C-inspired semantics make
for a very suboptimal systems language. D may be a better C/C++/Java
with modern features, but it's still burdened by C's ad-hoc-kiness,
such as its ugly excuse for a type system (among other things). My belief
is that with the right systems language it should be possible to build and
evolve an OS in much less time than what is happening with NT and Linux.
In the area of applications programming, scripting languages like
Python have already proven to be silver bullets (personal experience),
cutting down development times by orders of magnitude (well, at least
by one order =D ). Now we would like to see something similar happen
for systems development.
We have applications written in so many different languages, but
the foundational code we rely on - drivers and OSes - is almost
without exception still written in a 30-year-old language and
expressed using 30-year-old concepts. I can't help but feel there
must be plenty of good ideas by now that could greatly improve the
languages we use for systems programming.
"Operating systems are about blurring the boundary between programming
language and 'virtual machine' abstractions and real devices."
True, but remember that the operating system has to be written in
some language too.
Why has there been no real alternative for so long? Could it be because
there are precious few people who are both excellent language designers
and OS/kernel engineers?
What language concepts are best for /efficiently/, and at the same time
elegantly, expressing the process of turning the movs, pops, cmps, and
jmps of assembly language into concurrency, IPC, I/O, and data structure
abstractions? These have got to be concepts that map well both to
the physical hardware and to human/mathematical intuition.
C excels at the former but is terrible at the latter. High- and
very-high-level languages tend to cater for the latter at the expense
of the former.
I think it's high time a better compromise became available.
Must we keep writing our virtual machines in C? Are we forever
constrained to thinking in terms of C semantics (i.e., a C library) when
talking to the hardware? I think that requirement may be holding us back
from a lot of potentially rapid progress.