
Ready-to-run PC-based Data Acquisition Software

You can take the programming out of PC-based data acquisition and still enjoy all the advantages offered by computer technology. Just make sure the ready-to-run PC-based data acquisition software you choose has the features you need.

There's a common misconception regarding personal computers, data acquisition, and programming. It seems that most current and future users of PC-based data acquisition instrumentation believe that the three come as a set — that before any real measurements can be made with a PC and a data acquisition product, you have to write a program.

Don't believe me? Imagine that five hours before quitting time on the last day of the month, the production line crashes. Production control is yelling at maintenance. Maintenance is yelling at the manufacturer. Amid this chaos, your boss has two words for you: "Fix it." Under this pressure, what instrument do you take? A standalone box like an oscilloscope or chart recorder, or your PC and your favorite data acquisition solution? Face it. You'll choose the standalone box every time.

You're a victim of the "Have PC, Must Program" culture that relegates PC-based data acquisition to second-class status in a five-alarm emergency. But if you're a maintenance tech or troubleshooter, most of your "projects" are five-alarm emergencies, and you're spending a lot of time with older instrumentation. As a consequence, you're not benefiting from PC-based instrumentation in the most crucial situations, where speed and flexibility define the difference between success and failure.

Why don't you choose PC-based alternatives when the heat is on? The reason lies in the elegance of the standalone instrument: It doesn't have a steep learning curve because it doesn't require programming. It simply makes the measurement. Another reason is that you can imagine your boss's reaction if you said you'll get right on it, but it will take a few hours to write and debug the program. Then again, what program would you write? Do you have an intermittent problem that requires triggering, or do you have to record continuously? How many channels do you need to acquire and at what sample rate? In a troubleshooting situation with an unknown fault, the list of potential measurement protocols is almost endless. If you believe that a measurement requires a program written by you, you'll wisely chuck the PC in favor of the standalone instrument every time.

But can you avoid programming for maintenance and troubleshooting measurements with a PC? And in doing so, can a PC take on the measurement elegance and simplicity of a standalone alternative while preserving the advantages promised by a PC-based record? The answer to both questions is "yes" when you consider a preprogrammed, ready-to-run software package. What that package should do and how it should work is the focus of this article.

Table 1 is a checklist of the software features you'll need to put a PC to immediate use in your next task. You can treat it as a shopping list and apply it with confidence when comparing alternatives from multiple vendors because it has evolved and been time-tested over many years by users like you in critical situations like yours. Beginning with data acquisition features, we'll look at each item and examine its importance to any given application.

Data Acquisition
  • Real time display: wide frequency range, triggerable, multiple channels, never falls behind, supports time compression
  • Sample rate: wide range, intelligent over-sampling support, different sample rates per channel
  • Triggered storage: triggering off a waveform with pre- and post-trigger selections, trigger-and-stream, manual
  • Event markers, time and date stamping: enable markers synchronously and asynchronously, with or without comments; time and date stamp all markers and acquired data

Playback and Review
  • Multitasking: review current and historical information at the same time
  • Speed: waveform graphics must be faster than your ability to interpret
  • Waveform compression: speeds waveform interpretation of longer files
  • Quick look analysis: quantify graphical waveform information
  • Export: bring waveform data and graphics into your favorite applications for further study and presentation

General
  • Point-and-click user interface: all data acquisition and analysis features should be easily accessible, allowing instant data acquisition, playback, and analysis

Table 1 - These are the key features you should look for when selecting a turnkey data acquisition and analysis product. See text for detailed discussions of each.

Real Time Display

Absolutely no instrumentation feature is as important as the real time display. It's your focal point for all procedures, and it defines the sequence of measurement events you will apply as you work your way toward a solution. So it's surprising that many PC-based solutions treat it as an afterthought. Make sure the display of the product you're considering has these all-important attributes:

  • It must operate over a wide frequency range, since maintenance and troubleshooting activities generally carry an element of unpredictability.
  • It must provide triggering capability so periodic events are displayed in a stable manner.
  • It must support multiple channels as required.
  • Without exception, it must never fall behind (i.e., lose its real time attribute) even at the highest sampling frequencies and while data storage is active. This is absolutely critical in tuning applications where visual feedback needs to be accurate and instantaneous.
  • It must provide the ability to time-compress. This feature allows longer term trending of high as well as low frequency signals.

If you use a traditional instrument successfully, it probably has all the above features. When you apply a PC to the measurement task you should expect nothing less.

Sample Rate (Frequency Response)

The subject of sample rate may on the surface seem mundane and straightforward. Everyone knows that you select a range that is consistent with the frequencies you need to measure. However, there are two advanced features you should look for that will enhance your measurement flexibility in a variety of situations: selectable sample rates per channel and intelligent over-sampling.

The ability to select different sample rates per channel is very handy in situations where you measure high and low frequency signals simultaneously. Without it, you're forced to sample low frequency signals at sometimes ridiculously high rates that do little more than consume lots of disk space when you record.

Intelligent over-sampling (IOS) can pay handsome dividends in situations where you need to extract information from a high frequency waveform at a low rate. This sounds paradoxical, so I'll explain using an example. One of our customers suspected that a 400 Hz aviation power supply's output was intermittently sagging and surging (see Figure 1). The event occurred once or twice in a 24-hour period, and his plan was to continuously record its output over at least one full day. Using his oscilloscope was out of the question given the long-term recording nature of the test. He could have purchased an RMS amplifier for use with his older chart recorder, but then he'd have to sort through many feet of chart paper to determine whether he had a fault, and estimate its occurrence in time by counting grid divisions. Instead, he used one of our data acquisition systems that supports IOS. He set up a channel to sample at 10,000 samples per second, and instructed the software to request information once per second. For every one sample requested by the software, the hardware acquired 10,000. A system without IOS simply reports the 10,000th value and ignores the preceding 9,999. But IOS allows you to group all 10,000 samples by reporting their average, minimum, or maximum value. Our customer chose the maximum reporting mode, and the result was a trend plot at one sample per second that precisely defined the peak excursions of the 400 Hz power waveform.

Figure 1 — The background trace represents the output of a 400 Hz aviation power supply acquired at 10,000 samples per second. In the foreground is the same output acquired at 1 Hz using Intelligent Over-sampling (IOS). The trend plot of the IOS screen exactly matches the peak values of the 400 Hz waveform, but at one ten-thousandth of the sample rate.
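To make the arithmetic concrete, here's a minimal sketch of an IOS-style reduction in Python, assuming NumPy is available; the unit amplitude and 60-second duration are stand-ins for illustration, not details of the customer's test.

    import numpy as np

    def ios_reduce(samples, block_size, mode="max"):
        """Report one value per block of 'block_size' hardware samples.

        A system without IOS would instead keep only the last sample of
        each block: samples[block_size - 1 :: block_size].
        """
        n_blocks = len(samples) // block_size
        blocks = samples[: n_blocks * block_size].reshape(n_blocks, block_size)
        reducers = {"max": blocks.max, "min": blocks.min, "mean": blocks.mean}
        return reducers[mode](axis=1)

    # Illustrative use: a 400 Hz sine acquired at 10,000 samples per second,
    # reduced to one value per second in "max" mode, as in the example above.
    t = np.arange(0, 60, 1 / 10_000)                 # 60 s of acquisition
    waveform = np.sin(2 * np.pi * 400 * t)           # unit-amplitude stand-in
    trend = ios_reduce(waveform, block_size=10_000)  # one point per second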

Triggered Storage Modes

An absolute must to complement any type of waveform recording system is a good triggering algorithm with pre- and post-trigger capability. This feature allows you to control what data you acquire, and when. It is invaluable for troubleshooting intermittents, with the pre- and post-trigger features clearly defining cause and effect. Choose a system with large trigger buffers (on the order of 8,000 samples).
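To show the mechanics, here's a minimal sketch of a level trigger with pre- and post-trigger capture in Python; the threshold condition and the 4,000-sample buffers (8,000 total) are assumptions chosen for the example.

    from collections import deque

    def capture_triggered(stream, level, pre=4000, post=4000):
        """Return up to 'pre' samples of history, the triggering sample,
        and 'post' samples that follow; None if the trigger never fires."""
        history = deque(maxlen=pre)        # circular pre-trigger buffer
        it = iter(stream)
        for sample in it:
            if sample >= level:            # trigger condition met
                record = list(history) + [sample]
                for _, s in zip(range(post), it):
                    record.append(s)       # post-trigger samples
                return record
            history.append(sample)
        return None                        # stream ended with no trigger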

Another trigger mode called trigger-and-stream is useful when you need to acquire a large amount of data following a trigger. Instead of allowing the trigger buffer size to cramp you, trigger-and-stream allows you to stream data to disk following a trigger. For example, we have a customer in the Netherlands who develops wind-powered turbines used for power generation. It can be days before wind speed exceeds the threshold needed to move the turbine. When it does, our system is triggered and remains in an active storage state for 30 minutes at a high sample rate so the engineers can monitor and record startup transients and other events.
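A bare-bones sketch of trigger-and-stream under the same assumptions; the raw little-endian float64 file format here is chosen purely for illustration.

    import struct

    def trigger_and_stream(stream, level, path, sample_rate, duration_s=1800):
        """Once 'level' is crossed, stream 'duration_s' seconds of samples
        to 'path' instead of relying on a bounded trigger buffer."""
        it = iter(stream)
        for sample in it:                       # idle until the trigger fires
            if sample >= level:
                break
        else:
            return                              # stream ended before a trigger
        budget = int(duration_s * sample_rate)
        with open(path, "wb") as f:
            f.write(struct.pack("<d", sample))  # keep the triggering sample
            for count, s in enumerate(it, start=2):
                f.write(struct.pack("<d", s))
                if count >= budget:
                    break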

Exotic trigger modes aside, in many applications there's no substitute for the simplicity of pushing a button to start and stop data recording. Drop from consideration any system that doesn't allow you to stream data to disk on a continuous basis and over the full sample rate spectrum of the product. You'll find a number of packages that don't make the grade, but disk streaming is the bread and butter of most recording situations. You can't afford a system that runs out of gas before you're finished with the measurement.

Event Marking, Commenting, and Stamping

Wouldn't it be nice to be able to record a written diary of a test and have it become part of the data record along with time and date information? Most would agree, but you rarely think about these features until you need them.

An event mark is a marker intentionally placed in a file that denotes a point of interest. It may be enabled synchronously, such as when a trigger occurs, or asynchronously via a manual keyboard command. In many instances, markers may be commented to allow you to describe exactly what was going on at that instant so you can better interpret your results tomorrow, or a year from now. One recorded test can contain hundreds or thousands of event markers, each with its own unique comment and date and time of activation. Together, they form a written account of what you did during the test, why, and when.
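As a sketch of what such a record might look like in code, here's a hypothetical marker structure; the field names are illustrative, not any particular product's format.

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class EventMark:
        sample_index: int     # where in the recording the mark points
        timestamp: datetime   # wall-clock time the mark was placed
        comment: str = ""     # optional operator note

    @dataclass
    class EventLog:
        marks: list = field(default_factory=list)

        def mark(self, sample_index, comment=""):
            """Call from trigger logic (synchronous) or from a keyboard
            handler (asynchronous)."""
            self.marks.append(EventMark(sample_index, datetime.now(), comment))

    # e.g. log.mark(current_index, "valve 3 cycled; watching for a transient")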

As important as these data acquisition features are, they represent only half the solution. Just as critical is a way to review acquired data for interpretation and problem solving, either on site or off. Therefore, waveform playback and review software is a necessary complement to any data acquisition program.

Treat it as an afterthought, as so many do, and you risk an inordinate and unnecessary degree of frustration that will lead to discouragement and lack of confidence in any PC-based product. The reason is that most data acquisition applications generate a lot of data, enough to overload most conventional analysis software like Microsoft Excel, which is nearly always an inexperienced user's first choice. To attach some numbers: Excel goes into overload at about 64,000 samples, which essentially locks you out of any convenient approach to waveform review in moderate to high-speed sample rate situations.

If the data acquisition software supports disk streaming, then you will create large data files by definition. As such, the playback software must be able to work with disk-based files of any size, easily streaming data from the disk to the display in a forward or backward direction as you review data, looking ahead of or behind any random point.
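Here's a minimal sketch of that kind of random access, assuming the raw float64 file format from the trigger-and-stream sketch above; seeking makes the file size irrelevant.

    import struct

    SAMPLE_BYTES = 8  # one little-endian float64 ("<d") per sample

    def read_span(path, start_index, count):
        """Return 'count' samples starting at any 'start_index', so a viewer
        can page forward or backward from any random point in the file."""
        with open(path, "rb") as f:
            f.seek(start_index * SAMPLE_BYTES)
            raw = f.read(count * SAMPLE_BYTES)
        return [v for (v,) in struct.iter_unpack("<d", raw)]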

In addition to basic waveform review capabilities, there are other features you should look for that will enhance your ability to interpret recorded data:

Multitasking

You may not have thought about it, but a chart recorder offers extremely elegant multitasking ability. As the paper moves through the machine, you can view the real time point and instantly determine the current status of your measurement. Or you can scan down the paper to determine what happened a few minutes or a few hours ago.

Any viable PC-based solution must preserve this ability to instantly access current or historical information. The best approach is true multitasking, where the data acquisition task writes data to a file while the playback task accesses the same file at the same time. In this manner, two windows are available on your screen. One gives you access to previously acquired data, and another keeps you informed of real time activity. This feature is especially valuable in longer-term tests. Without it you're forced to wait for the measurement to end before you can evaluate results and make decisions.
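A rough sketch of the idea under the same file-format assumption: the acquisition task appends to the file while a review task reads it through a second handle (live_stream below is a hypothetical sample source).

    import struct
    import threading

    def acquire_to_file(path, stream, stop_event):
        """Acquisition task: append samples to 'path' until told to stop."""
        with open(path, "ab") as f:
            for s in stream:
                f.write(struct.pack("<d", s))
                f.flush()              # make new data visible to the reviewer
                if stop_event.is_set():
                    return

    # stop = threading.Event()
    # threading.Thread(target=acquire_to_file,
    #                  args=("run.dat", live_stream, stop)).start()
    # ...meanwhile, read_span("run.dat", ...) reviews the same file live.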

Speed

All the features in the world won't compensate for sluggish waveform graphics. You'll generally acquire large data files and have a pressing need to review them quickly and efficiently. Don't settle for any solution where the waveform graphics are slower than your ability to interpret them.

Waveform Compression

We had a customer who continuously recorded data from an oven for two weeks to characterize its temperature extremes and excursions. The goal was to correlate temperature changes with variations in product quality. Since the acquisition rate was once per second and she used a monitor with a horizontal resolution of 800 pixels, the largest time span she could view during playback was 800 seconds, or less than 14 minutes. Viewing 14 out of a total of over 20,000 recorded minutes at a time is like trying to watch a baseball game through a straw. It's really easy to miss the big picture.

Waveform compression (see Figure 2) solves this problem by allowing you to squeeze more data onto the screen through compression algorithms that ensure fast, transient excursions are preserved. With it, our customer was able to compress the entire two-week recording onto one screen for a bird's-eye view of the whole record.

Figure 2 — The background panel represents about a 14-minute display out of a 20,000-minute (two-week) recording of oven temperature. The middle panel shows the same two-week recording compressed into one screen width to reveal obvious temperature variations. In the foreground is a statistical evaluation of the two-week recording (over 1.2 million samples) to quantify mean, maximum, and minimum temperatures, with a number of other quantifiers.
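A common way to implement this is a min/max reduction per pixel column. Here's a minimal sketch, assuming NumPy; actual products may use more elaborate algorithms.

    import numpy as np

    def compress_minmax(samples, width=800):
        """Return (mins, maxs), one pair per pixel column, so fast transients
        survive no matter how many samples share a column."""
        per_column = len(samples) // width
        trimmed = samples[: per_column * width].reshape(width, per_column)
        return trimmed.min(axis=1), trimmed.max(axis=1)

    # Two weeks at one sample per second is about 1.2 million samples; at
    # width=800, each column still spans its full ~25-minute excursion.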

Quick Look Analysis

Viewing a graphical trend is valuable, but at some point it's usually necessary to quantify the results. This is where quick look analysis comes in. With it, you can choose any span of waveform data (compressed or not) and let the computer report a number of statistical measures such as minimum, maximum, mean, median, and many more (see Figure 2). Other handy forms of quick look analysis are X-Y plotting and frequency domain analysis like FFT and DFT.
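In code, quick look analysis can be as simple as a handful of reductions over the selected span. A minimal sketch, assuming NumPy:

    import numpy as np

    def quick_look(span):
        """Summary statistics for any selected span, compressed or raw."""
        return {
            "min": float(span.min()),
            "max": float(span.max()),
            "mean": float(span.mean()),
            "median": float(np.median(span)),
            "std dev": float(span.std()),
        }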

Data and Graphical Export

Even with quick look analysis at your disposal, you'll probably reach a point where you'd like to export all, or any portion of the waveform data or graphics to another package for further analysis or presentation. Ensure that the packages you consider can export data in a format that is compatible with your favorite analysis software (Microsoft Excel is probably the most popular). The ability to paste waveform graphics directly into a report from your word processor provides yet another outlet for using PC-based data to its fullest potential.
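For the data side, comma-separated text is the lowest common denominator; Excel and most analysis packages open it directly. A minimal sketch:

    import csv

    def export_csv(path, times, samples):
        """Write (time, value) pairs as CSV for import elsewhere."""
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["time_s", "value"])
            writer.writerows(zip(times, samples))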

Conclusions

It's important to emphasize that all the features I've discussed (and many I haven't) are a simple mouse click away in the typical ready-to-run software package. From the instant the application is opened, it's acquiring and displaying data. Your only task is to configure a specific measurement protocol by accessing dialog boxes from pull-down menus. In the spirit of Windows' common user interface, if you can use a word processor, you can acquire data.

Further, your decision to program or use a pre-programmed approach need not be mutually exclusive. There are many applications where a programming language like Visual BASIC, or an object-oriented language like HP VEE, is more appropriate. Examples include any repetitive application where the measurement is specific and well defined. Even less-defined applications are candidates, assuming you have the time to write and debug a program. At least as common, though, are applications with unknown measurement requirements where time is of the essence. You'll find them especially in the maintenance and troubleshooting applications we've discussed. You'll also find them in design qualification, quality, process monitoring, and a variety of other situations. This is the domain of the pre-programmed solution.

Learning to recognize the difference between applications that require a programmed approach and those that don't will allow you to apply PC-based instrumentation in a more productive manner. You'll rely less on bulky traditional instruments, and you'll spend less time programming. In the process, you'll be that much closer to the ideal of PC-based instrumentation. It isn't just for the bold and brave. It's for anyone who wants a better tool to make measurements and solve problems — whether they can program or not.