The hardware/software distinction is fast becoming a moot point. It's only worth splitting hairs over if the hardware is an analog synth, and even then, you could consider the wiring/routing of the electrical signal as the "software" of the synth. Depending on how stoned you are, you can take "hardware" and "software" to mean almost anything these days, so the first thing that should happen in this discussion is agreeing on what we all mean by hardware and software.
I like to think of hardware as creating a medium for a particular task to be completed in, and software as an interface between the medium and the operator. So (definitions of "interface" notwithstanding), in the case of an analog synth, the hardware creates the medium of electrical signals for sound to be parsed from, and the software is the keys, knobs, etc. This isn't a rigid definition, nor is it all-inclusive...it's just something that works for me in most cases. I don't need it to be rigorous, really.
But what it does is let the discussion stop being a simple hardware/software dichotomy and start being about the interactions between the hardware and the software, and then about the interactions between the operator and the tool. Ultimately, all any of this is doing is letting us take an internal thought or idea and externalize it...the value in really expensive tools is the ease with which they let us externalize something, which is one way to think about operator/tool interactions. At the same time, a tool designed in an unusual way lets us conceive of entirely new interactions we hadn't really thought of before, which is another kind of interaction.
Most software tools fall in the first camp -- easing an externalization. If you think about it, that's kind of the point; software is supposed to "understand" the hardware really well so it can interpret a user's actions and translate them into something the hardware can make sense of. So stuff like Ableton and LSDJ -- that stuff eases the externalization of our ideas because it makes the actual creation of sound easy enough for us to do comfortably. Hardware, on the other hand, is often concerned with the second camp -- creating novel interactions that compel us to think about the software differently. This is also kind of intuitive -- changes in medium obviously create way different sounds; a string instrument is going to make a completely different sound from an electrical signal, etc.
The REALLY interesting stuff, however, happens when the hardware and software become deeply intertwined, as with LSDJ/Nanoloop and the Gameboy (see, I managed to bring it back to chiptune). I honestly can't imagine seriously composing an LSDJ song in an emulator, because it just makes SENSE to use LSDJ on a Gameboy (this is just me; obviously other people do things differently, it's just a talking point). At the same time, what happens if LSDJ gets mouse control? To me, it loses its compelling feature (being one of the only interfaces between the Gameboy sound chip and the user) when it starts becoming less exclusive...that's my theory on why LSDJ is kind of the de facto DMG tool.
So I guess in the end I'm not sure where I'm going with this >_<
EDIT: I'm really sorry if this post kills this thread because it's way too long
EDIT2: Man, long essays about nothing is what a digital arts minor gets you...all the digital arts MAJORS are out actually, like, making shit