20 July 2010 @ 11:46 am
LabVIEW  
---This message appears on each public post I make.  I do not plan on posting many public entries.  If you consider yourself interested in my writings here on LiveJournal, it is your responsibility to contact me and pursue friend status.  In the end, I am under no obligation to friend you.---

LabVIEW is one of the best and most professional pieces of software out there, especially for any task that requires design or testing.  To summarize, it's visual, object-oriented programming specifically tailored for external input/output.  Most of the programming you're taught in school stays at the software level.  The furthest you may go is reading from or writing to a file, but everything you do concerns devices that are, more or less, attached to your motherboard.

Go up a level and you're programming a specific device.  You write up the program, download it via USB, serial, parallel, or even wirelessly to a device, like a remote control, and then that device runs the program.  Rather, the microprocessor in that device receives the program and can run it.  This is firmware.
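(For the curious: firmware doesn't have to be exotic.  Here's a sketch in C of about the simplest thing you'd flash to a chip.  I'm assuming an AVR-style microcontroller and a made-up clock speed, and all it does is toggle a pin forever.)

    /* A minimal firmware sketch, assuming an AVR microcontroller.
       Built with avr-gcc and flashed over a programmer; after that
       the chip runs it on its own, no PC attached. */
    #define F_CPU 1000000UL        /* assumed 1 MHz clock */
    #include <avr/io.h>
    #include <util/delay.h>

    int main(void)
    {
        DDRB |= (1 << DDB0);           /* make pin B0 an output */
        for (;;) {                     /* firmware never really exits */
            PORTB ^= (1 << PORTB0);    /* flip the pin... */
            _delay_ms(500);            /* ...twice a second */
        }
    }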

There's no end-all-be-all distinction between software and firmware.  Any way you look at it, at the root they're both written programs.  Generally, the more of the code you can change without rendering some physical device, like a chip, useless, the more like software it is.  A better way to put it might be that lower-level programming is more suited to firmware, while higher-level programming is found more often in software.

Then there's hardware.  When you start talking about hardware, you're not in the computer world anymore.  Rather, you're not in the binary world, which is what we're taught with programming.  Everything's 1s and 0s, yes, yes.  Even in a computer running software, that's not technically the case: you have voltages that are most certainly not exactly 0 or, say, 5 volts.  Luckily, the way we've designed things allows them all to function in, more or less, an on or off state.

Essentially, what I'm saying is that inside a computer we can treat everything as digital, even though it's still physically analog at the most basic, subatomic level.  When you get outside the computer, though, all bets are off, and you're dealing with full analog (even integrated circuits that run on Boolean, or digital, logic are analog underneath).
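(To put actual numbers on the on-or-off trick: a TTL logic input, for instance, treats anything at or below 0.8 volts as a 0 and anything at or above 2.0 volts as a 1, with a no-man's-land in between.  In C, the thresholding amounts to something like this; the voltages here are just hard-coded samples, not real measurements.)

    #include <stdio.h>

    /* Classify an analog voltage the way a standard TTL input does:
       <= 0.8 V is logic 0, >= 2.0 V is logic 1, in between is undefined. */
    static int ttl_level(double volts)
    {
        if (volts <= 0.8) return 0;
        if (volts >= 2.0) return 1;
        return -1;  /* the forbidden zone: the gate could go either way */
    }

    int main(void)
    {
        double samples[] = { 0.1, 0.75, 1.4, 2.3, 4.9 };
        for (int i = 0; i < 5; i++)
            printf("%.2f V reads as %d\n", samples[i], ttl_level(samples[i]));
        return 0;
    }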

They don't teach you that in programming.  If you're ever writing firmware for a device, you get to write a program from scratch, download it onto a fully built board, run it, grab data somehow in ways you've specifically programmed, and go from there.  Now, the other fun thing to consider when you're messing with hardware rather than software is that hardware has this nasty tendency not to check whether it's about to stand outside in a lightning storm with a metal rod.  If you fuck up the programming, unlike a system where you're just doing things software-side, you can, and will, physically damage devices or blow them up.  Small blow-ups, but explosions nonetheless.
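(Hence all the paranoia in code that drives hardware.  Here's a sketch of the kind of defensive clamping you end up writing everywhere; the DAC call and its 12-bit range are invented for illustration.)

    #include <stdio.h>

    #define DAC_MAX 4095  /* assumed 12-bit DAC full scale */

    /* Stand-in for a real driver call; pretend this hits a register. */
    static void dac_write(unsigned int code)
    {
        printf("DAC <- %u\n", code);
    }

    /* Clamp before driving the part, so a bug upstream can't ask the
       hardware for something it physically shouldn't do. */
    static void set_output(int requested)
    {
        if (requested < 0)            requested = 0;
        else if (requested > DAC_MAX) requested = DAC_MAX;
        dac_write((unsigned int)requested);
    }

    int main(void)
    {
        set_output(-50);    /* clamped to 0 */
        set_output(70000);  /* clamped to 4095 */
        return 0;
    }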

The testing is an arduous process, and building entire test circuits that won't be used in final designs can be annoying.  This is what I feel LabVIEW was essentially made for.

It's designed around a visual, object-oriented programming surface with many premade, configurable functions tailored for input and output to other hardware via digital-to-analog conversion and vice versa.  Not only that, it can generate those signals and data as well as analyze both for you.  What we've got with LabVIEW is an all-in-one package that simulates power supplies and function generators and provides a quick-to-set-up programming interface for arranging all this input and output.  Like all visual programming software, it has a visual control area for use while the program is running, too.  Who needs a bunch of switches, dials, and potentiometers for manual attenuation of input signals when the program builds in such controls for every variable of a sine wave?
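(Strip the graphics away and that simulated sine source boils down to a few parameters in, a buffer of samples out.  Roughly, in C; the numbers here are arbitrary:)

    #include <math.h>
    #include <stdio.h>

    /* Roughly what a simulated sine generator does under the hood:
       amplitude, frequency, offset, and sample rate in; samples out. */
    static void sine_wave(double amp, double freq_hz, double offset,
                          double sample_rate, double *buf, int n)
    {
        for (int i = 0; i < n; i++)
            buf[i] = offset
                   + amp * sin(2.0 * 3.14159265358979 * freq_hz * i / sample_rate);
    }

    int main(void)
    {
        double buf[8];
        sine_wave(1.0, 50.0, 0.0, 1000.0, buf, 8);  /* arbitrary settings */
        for (int i = 0; i < 8; i++)
            printf("%f\n", buf[i]);
        return 0;
    }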

It's -extremely- convenient, especially when you consider that a lot of firmware is written in assembly.  Programming controls for all of these things would first require a microcontroller that could handle them and then lines and lines of code that, of course, have to be tested and debugged.

At its root, though, it's still just convenience.

Convenience I need to use, considering I don't have much in the way of resistors, capacitors, integrated circuits, or a whole crapload of other things to do my own testing.

Which brings me to my point of this entire entry.

LabVIEW is easy to use.  Most things are point and click, drag and drop, type in some parameters, and you're good.  That's all when everything is set up, though.

I'm over here trying to configure it from scratch to even allow me to simulate a data acquisition device so I can play with that.  I mean, it's in the Getting Started guide.  I'm here, I have my data acquisition block, I'm configuring it...I'm going to set it up to get an analog input....analog voltage......aaaand now I'm supposed to select a device to get it from.  Makes sense.  I don't have any devices configured.  Makes sense.

I don't have any physical devices, so how do I make a virtual one?  Okay, do this, do that ... ERROR.  Maybe do this, do-ERROR.

Aw shit.
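(For anyone playing along: the docs say NI's Measurement & Automation Explorer can create a simulated NI-DAQmx device, and once a device exists, real or simulated, the virtual channel I'm chasing looks, written out in C against the DAQmx driver, roughly like this.  Treat it as a sketch: "Dev1/ai0" is just whatever name MAX hands the device, and I've left out all the error checking.)

    /* Sketch: one virtual analog-input voltage channel via the NI-DAQmx
       C API, reading 100 samples.  A simulated device created in NI MAX
       behaves like real hardware here.  Error checking omitted. */
    #include <stdio.h>
    #include <NIDAQmx.h>

    int main(void)
    {
        TaskHandle task = 0;
        float64 data[100];
        int32 read = 0;

        DAQmxCreateTask("", &task);
        DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "",   /* placeholder name */
                                 DAQmx_Val_Cfg_Default,
                                 -10.0, 10.0, DAQmx_Val_Volts, NULL);
        DAQmxCfgSampClkTiming(task, "", 1000.0, DAQmx_Val_Rising,
                              DAQmx_Val_FiniteSamps, 100);
        DAQmxStartTask(task);
        DAQmxReadAnalogF64(task, 100, 10.0, DAQmx_Val_GroupByChannel,
                           data, 100, &read, NULL);
        printf("read %d samples, first = %f V\n", (int)read, data[0]);
        DAQmxClearTask(task);
        return 0;
    }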

The thing with LabVIEW is that it's so widespread it has a community of professionals that rivals the population of World of Warcraft, or at least seems to.  For anything and everything you need to know about LabVIEW, National Instruments has extremely detailed help, tutorials, videos, and even forum posts.

But there is so much freaking information out there that, while what you need is somewhere in this giant database, you're not guaranteed to find it unless you're already a master of LabVIEW and all the terminology it uses, which, by the way, is a ton.

Oh yes, I've searched for creating a virtual input (channel, rather).  I've seen countless help topics on how I might do this.  I read one, it links to another, I read that one, it links to another...so on and so forth.  Then I go back and select a new help topic, which links to another topic that I've already been linked to.

It's like opening up the Bible in Word, searching for "Jesus," and then trying to find what you need from the massive list of matches.  You know you don't need a match that also deals with God, but all those ones with Luke, Mary, Matthew, Paul, David, Romans, Hebrews - you have no idea what any of them actually mean, and they all allude to each other.   You can't just exclude them from the search, 'cause one of them might be part of what you're looking for, but chances are they're not.  I've been at this, off and on, for a few days.  A lot of good that Getting Started guide was!  I need to take a class or get a mentor.  This is almost something that you should be able to get a degree in - at least an associate's!

Basically, the whole thing I'm saying here is holy fuck, I'm confused.

Current Mood: confused
blunova on July 23rd, 2010 04:25 am (UTC)
Hahaha, I love your Jesus/Bible analogy. I used to feel that way about the Instruction Manuals for our relays (each one is a 3+" binder). I've done a little bit of LabVIEW in the past (I played with it a little bit in undergrad for a project, and then some more in Grad School), but I doubt I ever got as in-depth as you're getting. I did really like it, though!

(Oh and BTW... your post totally sounded like a plug for NI! Hahaha!)
prismmoogle on July 23rd, 2010 04:28 am (UTC)
I just wanted to explain it for the non-engineers out there :(

I got out of it for the time being. I'm back to board design and updating a piece of our tool that apparently nine other engineers can't do. I got it simulating in a few hours with updated components.

Morons.