
Data Acquisition and Validation (DAQ) Lead

In January 2019, after serving as Chief Engineer, I moved into the position of Data Acquisition and Validation (DAQ) Lead, which I held until graduation in May 2020. I co-led this subsystem with another astute student and friend: she worked primarily on post-processing code, while I handled hardware, deployed code and testing. During my tenure as DAQ Lead, I helped facilitate explosive growth of the subsystem into a team of as many as 10 people, with ever-better equipment, training and aspirations. Never has the team had such a wide and configurable array of sensing options for performing validation experiments. Though much of the work done in this role relates to electronics or programming, a developed understanding of engineering design and physics is essential for motivating development and interpreting data.


Core skill areas/tools in this role:

  • Electronic circuit design (analog and digital)

  • Assembling, testing and troubleshooting prototype sensor systems

  • Calibrating, error profiling and other characterisation of sensors

  • Programming microcontrollers (MCUs) and FPGAs (C/C++ and LabVIEW)

  • Writing data processing programs in MATLAB

  • Assembling, soldering electronics

  • Designing, CAD modelling and producing mounting, casing and other ancillary hardware

  • Researching programming and electrical engineering literature

  • Design of experiments

  • Designing and implementing documentation templates and strategies

  • Team leadership

  • Technical project planning

  • Running tests and data collection excursions


As DAQ Lead, typical work involved:


  • Research, testing and integration of sensors into new and existing hardware platforms: Think breadboarding, working with scopes and other bench-top equipment, and writing experimental MCU code to create proof-of-concept setups. Typically, this involved gauging error, ensuring sample rates comfortably above the Nyquist rate were attainable, and building conditioning/supporting circuitry.

  • Developing test procedures: I worked with all the other subsystem leads to strategise about what data are attainable, which new datasets should motivate novel DAQ developments, and how specific experimentation procedures can be used to identify useful data or design parameters. For example, we wanted to determine the wheel slip angles (the angle difference between where the wheels ‘point’ and the tangent line at the actual turning radius) over a wide range of driving and ground conditions. I developed a system for measuring this by creating the hardware to use a pair of analog accelerometers (with 14 kS/s rates for low-drift integration) attached to the frame to measure the rotational rates of the frame, treated as a rigid body (from which the turning radius can be extracted). I then combined this with a system for measuring rack travel with a linear displacement sensor, so that during controlled maneuvering, slip angles could be calculated from the combined datasets.

  • Calibrating, error characterising and environmental testing of sensors: I used trusty bench-top equipment and campus labs widely to calibrate sensors and identify their error profiles when used in an integrated system. Sometimes these tasks were rather involved; for example, in order to validate supposed improvements to damping at the engine mount, vibrations of the engine needed to be measured. This sort of high-speed, low-amplitude vibration measurement is typically the domain of laser Doppler vibrometers (LDVs) or ultra-precision accelerometers, but I wanted to determine whether a simple $15 MEMS accelerometer could do the job. I devised an experiment to test this using beam theory validation equipment (to precisely vibrate the accelerometer) alongside an LDV in a mechanical engineering lab, and was able to obtain a transfer function for the cheap accelerometers from Bode data. This way, we managed to complete the mount validations without learning to use new equipment and with zero costly investment.

  • Informing designs to lend themselves to convenient validation: I would collaborate with the design teams to modify designs as necessary, usually in minimal ways such as the addition of reference surfaces or tabs, so that collecting data to validate things like loads on a part becomes easier, or possible at all.

  • Learning, learning, learning: I taught myself a raft of skills as DAQ Lead, from MCU programming, to analog signal conditioning circuit design, to stochastic analysis methods. Whilst we were always cautious to avoid ‘reinventing the wheel’ with regard to our DAQ endeavours, extensive self-teaching and applications ideation on my part was necessary even to establish capabilities in the conventional methods of collecting data with microcontrollers.


Two significant projects dominated my tenure as DAQ Lead:


  • Master DAQ platform: I led the development of two iterations of a platform to break out a Teensy 3.6 MCU in a physically robust package with built-in conditioning circuitry, non-volatile memory, a battery power supply and multiple communications options (UART, I2C, even RF!). This device mounts to the car and provides ample reconfigurable I/O options for hooking up the sensors we currently use: IMUs, analog displacement sensors, digital HE WSSs, strain gauges/FTs, pressure transducers and switches, with capabilities for much more. In addition to ideal, customised electronics, ports and containment, classes were written for each type of sensor to make programming effortless. Now, custom DAQ experiments are well within the abilities of even the team’s most junior engineers. This device has been used to collect hundreds of millions of data points, all helping to inform design decisions through validation and input case generation. There were innumerable challenges involved in perfecting its operation, especially on the software side. For example, the canned functions for writing to the SD card are generic, so they are not optimised for writing raw byte data at high rates (15 kS/s). I therefore wrote a system that uses only primitive operations, like bit shifting and working directly with registers, to decompose high-precision data into a byte stream that is written to memory very quickly and can later be reassembled with a custom MATLAB script. The second iteration of the platform improved on the first, adding additional MCU hardware for eCVT (electronic continuously variable transmission) control, more I/O and the team's first two-way RF comms system.

  • Automated shock dynamometer: I designed, built and tested, from scratch, a dynamometer for characterising shock absorber parameters. For just the cost of two outsourced dynamometer tests ($1100), I created this highly adaptable device with adjustable hardware and a simple GUI programmed in LabVIEW. I conceived it as a machine to serve the team’s shock tuning needs for years into the future, which is why I designed it to accommodate a wide range of shock geometries and stiffnesses, and to be usable with no programming experience. By measuring the reaction force on the shock (with an FT) and the shock displacement (with an LDS), both over time, the system stores and reports shock stiffness, reaction force and damping in real time. This project was completed in a single semester and entailed challenges around programming in LabVIEW, electrical power control and the design of a physically robust device.


Whilst I didn't work on it directly much myself, as DAQ Lead I led the subsystem in co-creating the team's first-ever eCVT, which will be run at upcoming competitions. This project involved closed-loop control optimisation, high-speed sensing challenges and the meticulous design of fail-safe MCU code. A short video of the eCVT in testing is shown below the gallery.


DAQ Gallery


eCVT in early testing
