There’s an old saying in engineering that simply states: ‘If it hasn’t been tested then it doesn’t work’ – or rather, you should assume it doesn’t work. Most people, however, assume the opposite, and that’s where the problems start because, as we all know, there’s another maxim which states ‘assumption is the mother of all mess-ups’.
The degree of testing is of course the next decision. A simple design such as a torch can be tested by turning it on and off; to extend this test you can turn it on and off many times to check the integrity of the principal component – the switch. This type of test is known as a functional test, as you are testing the product’s function. Other types of testing – known as structural testing – break the test process down to check the smallest elements of the design. In our torch this could be the continuity of the conductors to the switch, the switch’s open (isolation) and closed resistance, the voltage of the batteries, the lamp (bulb) impedance and so on – a set of tests that will give a more precise diagnostic than our functional test if the unit fails on any aspect.
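The difference in diagnostic power can be sketched in a few lines of Python; the torch measurements and limit values below are invented purely for illustration:

```python
# Sketch: functional vs. structural testing of the torch example.
# All measured values and limits are hypothetical; a real tester
# would obtain them from instruments.

def functional_test(lamp_lit_when_on):
    # One pass/fail answer: the torch works or it doesn't.
    return lamp_lit_when_on

def structural_test(measurements):
    # Check each element against its own limits, so a failure
    # points at a specific component rather than the whole unit.
    limits = {
        "battery_v":         (1.40, 1.65),  # single-cell voltage
        "switch_closed_ohm": (0.0, 0.5),    # closed-contact resistance
        "lamp_ohm":          (2.0, 10.0),   # cold filament resistance
    }
    return {name: lo <= measurements[name] <= hi
            for name, (lo, hi) in limits.items()}

measured = {"battery_v": 1.52, "switch_closed_ohm": 0.2, "lamp_ohm": 4.1}
print(functional_test(True))      # -> True (works, but no diagnostics)
print(structural_test(measured))  # per-element pass/fail map
```

The structural result tells you *which* element to rework; the functional result only tells you whether to ship or scrap.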
But testing’s not my problem!
Having worked for many years within an established company in the world of PCB assembly testing, we still come across companies (OEMs) who will tell us ‘..we don’t need to test our boards, because we demand that the contract manufacturer (CEM) only sends us 100% working boards’. But of course demanding and actually receiving are two different matters. I would respectfully suggest that no-one could achieve this directive, and even if they did, what would it actually cost? The OEM/design authority might neither know nor care how their boards are being tested, but of course they should. For example, the CEM might only undertake a cursory functional test using some type of hot mock-up (system simulator) – i.e. when it powers up, does it do roughly what it should? If the board fails this test then it is cast aside and another built to replace it until the order is complete. This means that, depending on the CEM’s process quality and yield, any number of boards could be hitting the ‘bone-pile’ – and guess who is ultimately paying for that? The OEM, naturally. If boards can be diagnosed and then fixed at a cost lower than their value (component costs plus build costs, i.e. labour and factory overhead), and the manufacturing process debugged into the bargain, then of course it makes sense to test more rigorously – but who actually decides that?
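The underlying economics amount to a trivial calculation; a minimal sketch, with cost figures invented purely for illustration:

```python
# Sketch: the test-or-scrap economics described in the text.
# A failed board is worth diagnosing and repairing only if that
# costs less than its replacement value (components + build cost).
# All figures below are hypothetical.

def worth_repairing(component_cost, build_cost, diag_cost, repair_cost):
    board_value = component_cost + build_cost
    return (diag_cost + repair_cost) < board_value

# Hypothetical case: 80 of components, 40 of build cost (labour and
# overhead), 15 to diagnose on a structural tester, 10 to rework.
print(worth_repairing(80.0, 40.0, 15.0, 10.0))  # -> True: repair, don't scrap
```

For cheap boards the inequality flips, and the hot mock-up plus bone-pile approach can genuinely be the rational choice.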
This is of course a joint decision that must be based on hard facts and knowledge of the manufacturing process. Both OEM and CEM must discuss Design for Test (DfT), fault coverage, diagnostics resolution (test system performance) and more before jointly determining the optimum (most cost-effective) test path. Most good test systems will now provide a fault coverage assessment figure, but what does that mean, how is it derived and can it be trusted? Independent fault coverage assessment systems are also available that can take inputs from a variety of test systems and aggregate the results to give you an overall test coverage figure. However, these still require a great deal of understanding and maintenance, and generally suit large organisations with a dedicated DfT staff member.
Determining the test strategy
Once you have decided that it makes sense to take care of your own test requirements, decisions must be made to define the optimum system. The first set of questions to ask includes:
- How large is the biggest Unit Under Test (UUT) you need to test?
- Are my boards principally Analogue or Digital?
- Do my boards support JTAG/Boundary-scan?
- Are there any ‘unusual’ aspects to test (RF, high voltage)?
- What volume of boards will be tested, and are multiple systems needed?
- Do I in fact want to hand over the responsibility to a 3rd party?!
Very large boards may rule out certain types of tester, such as ‘flying probers’; similarly, boards with a large test point count may require a vacuum or compressed-air system to engage the UUT with the test pins. Meanwhile, RF boards might require testing within a shielded (Faraday cage) enclosure to eliminate electromagnetic influences (EMC and EMI), and high-voltage boards will require additional fail-safes and trip switches to isolate the user from dangerous power supplies and/or the UUT itself. In these cases your fixture supplier is your friend: they can advise on many aspects of test fixture/system design, and also on whether you can utilise a generic, re-configurable system.
The test fixture itself is of course simply the access mechanism that allows you to probe specific test points on the UUT with the required fidelity. On its own it is mostly useless, and so it should be coupled to an optimised set of test equipment. You could try using the same test equipment that the developer used to debug the design, and in some smaller businesses (SMEs) this is often tried. However, most design tools are not made to withstand the rigours of production testing and, importantly, do not always support production test software to allow integration into a broader test executive. Pressing on with this type of ‘mix and match’ approach can lead to a compromised system comprising multiple user interfaces and a confusing, possibly illogical, test procedure. With this in mind it might be preferable to decide first on your test executive and then select rugged instruments that can be controlled by it.

Your next decision will be to determine the ‘diagnostics resolution’ you require, as this will affect the extent of the instrumentation needed. In our earlier torch example the simple functional test of turning it on gives a diagnostics resolution down to the entire UUT only. However, by deploying a digital multi-meter (DMM) you could measure the battery voltage, the switch’s open and closed resistance, the lamp impedance and even the continuity of the interconnections. In this way diagnostic resolution is greatly improved, and rather than discarding the entire unit a repair/rework action can be taken.
For a typical basic (structural) ATE system such as a Manufacturing Defects Analyser (MDA), only a few simple measurement features are needed – e.g. voltage, frequency, timer and continuity across a few hundred channels. Adding pattern generation/detection, i.e. digital I/O channels, then allows you to extend testing into the functional domain, and a further addition of JTAG/Boundary-scan (IEEE Std 1149.1) interfaces introduces possibilities for full device-to-device interconnection testing, memory cluster connection tests, logic tests, programming of devices (flash memory, CPLDs) and more. For an ultra-compact tester, all of the above features can be found in a JTAG Technologies JT 5705/FXT module. Indeed, the JT 5705 may, on its own, provide all you need for the required diagnostic resolution. If not, it is fairly simple to select specialist instruments to augment this capability, such as oscilloscopes, RF generators, power meters, timer/counters, matrix switching etc. Testers that combine functional and structural test aspects are commonly known as ‘combinational testers’.
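At the software level, such a combinational tester boils down to sequencing structural and functional steps. A minimal sketch, with stand-in step functions in place of real instrument drivers:

```python
# Sketch: a minimal test-executive loop for a combinational tester.
# The step functions are hypothetical stand-ins; in a real system each
# would drive an instrument (MDA channels, DMM, boundary-scan, etc.).

def continuity_ok():    return True   # e.g. MDA continuity scan
def rail_voltage_ok():  return True   # e.g. DMM supply-rail check
def boundary_scan_ok(): return True   # e.g. IEEE 1149.1 interconnect test
def functional_ok():    return True   # e.g. hot mock-up behaviour check

STEPS = [
    ("Continuity",    continuity_ok),
    ("Power rails",   rail_voltage_ok),
    ("Boundary-scan", boundary_scan_ok),
    ("Functional",    functional_ok),
]

def run_uut(steps):
    # Structural steps run first; abort on the first failure so the
    # diagnostic points at the earliest (cheapest) fault found.
    for name, step in steps:
        if not step():
            return f"FAIL at {name}"
    return "PASS"

print(run_uut(STEPS))  # -> PASS
```

Ordering structural checks before the functional step is the usual choice, since a shorted net found by continuity testing is far cheaper to diagnose there than as a mysterious functional failure.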
FPGA (Field Programmable Gate Array) technology is now also helping to improve flexibility in test systems by allowing reconfigurable instruments to be embedded within the fabric of the device; these are typically controlled by PXI or JTAG interfaces. JTAG Technologies’ JT 5705 and JT 5112 MIOS units are examples of reconfigurable tester modules that may be controlled and reprogrammed via JTAG. Examples of the type of test instruments that can be built include serial bus interfaces (SPI, CAN, I2C, Ethernet) and others such as DDRx memory interfaces.
Software selection for combinational testers
Once you have selected your fixture, instruments and power supplies you will need to select a test executive software solution. Alternatively, you might wish to seek the advice of an experienced test system integrator that offers a professional test development service. Either way, you will likely have to choose from one of the more popular software options listed below:
- National Instruments (NI) software. NI offers a range of options from the ubiquitous LabVIEW through to LabWindows/CVI and TestStand. These products are well supported, and NI ensures that there is a network of software engineers trained in the use of its tools. The LabVIEW graphical programming system was originally designed as a research scientist’s tool but now appeals to many non-programmers in the ATE world. LabWindows/CVI, meanwhile, offers a more conventional programmer’s interface (API), although it is not a full implementation of ANSI C. NI software is naturally well supported by a host of instrument drivers, often written by the instrument vendor.
- Python – often known as the engineer’s programming tool, Python is praised for combining the simplicity of BASIC with many of the advanced features and flexibilities of C. Another major attraction is that it is open-source and thus essentially free. PyVISA is a Python ‘wrapper’ that offers easy access to shared DLLs built to the Virtual Instrument Software Architecture (VISA) specification laid down in the 1990s. This allows high-level control of conventional instruments, while JTAG Technologies also provides its own library for boundary-scan activities.
- .NET – Microsoft’s .NET Framework provides interoperability across several programming languages. Programs written for the .NET Framework execute in a software environment (as opposed to a hardware environment) known as the Common Language Runtime (CLR), an application virtual machine that provides services such as security, memory management and exception handling. Supported languages range from C# to Visual Basic.
- Others: Marvin Test Solutions’ ATEasy, Keysight VEE and JTAG Technologies’ AEX Manager.
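As an illustration of the Python route, here is a hedged sketch of a single test step. The PyVISA calls appear only in comments, since they need real hardware and the instrument address and SCPI query shown are assumptions; the limit-checking logic itself is self-contained:

```python
# Sketch of one measurement step in a Python-based test executive.
# With real hardware the reading would come from an instrument via
# PyVISA, for example (hypothetical address and SCPI query):
#   import pyvisa
#   rm = pyvisa.ResourceManager()
#   dmm = rm.open_resource("GPIB0::22::INSTR")
#   reading = float(dmm.query("MEAS:VOLT:DC?"))
# Here a simulated reading keeps the sketch runnable without hardware.

def check_limits(name, reading, lo, hi):
    """Evaluate one measurement against its limits; return (ok, report)."""
    ok = lo <= reading <= hi
    verdict = "PASS" if ok else "FAIL"
    return ok, f"{name}: {reading:.3f} [{verdict}] (limits {lo}..{hi})"

reading = 1.48  # simulated DMM result, volts
ok, report = check_limits("Battery voltage", reading, 1.40, 1.60)
print(report)
```

A real executive would loop such steps over a test list per UUT and log each report line; the point here is simply how little scaffolding the Python option needs.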
Specifications & documentation
Having previously defined a macro-view of what you are looking to test, it will now be necessary to provide a detailed test specification per UUT so that the system programmer can set about the task in a logical manner. This is especially important in cases where a third-party bureau or systems integrator is involved. Glib requests such as ‘test as much as you can’ are pretty much meaningless and can provoke much future argument and contractual wrangling. Such a test specification will also feature in the acceptance specification of the tester, along with checks for compliance with electrical safety and EMC measures.
Documentation of the finished system is another important aspect that can extend the useful life of a tester. A great many carefully engineered systems have been abandoned due to the fact that the lead engineer has left the organisation without leaving adequate documentation.
Commissioning & Maintenance
The commissioning process for a new test system should be the same whether the equipment was developed in-house or supplied by a third party. In each case the system must be inspected for compliance with safety specifications before it is even powered up. Subsequently, the UUT program can be executed with a known-good (‘golden’) board to check for false failures. If possible, a board with known failure(s) that are covered by the tester should also be run through. Limits checking should then be fine-tuned to cover the spread of acceptable results.
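That fine-tuning step can be sketched as deriving limits from the spread of golden-board results; the readings and the ±4-sigma window below are assumptions chosen for illustration, not a prescribed method:

```python
# Sketch: deriving test limits from a set of known-good ("golden")
# board measurements, as in the commissioning step above.
# The readings and the k-sigma factor are hypothetical.
from statistics import mean, stdev

def limits_from_golden(readings, k=4.0):
    # Centre the limits on the golden-board mean and spread them by
    # k standard deviations, to avoid false failures from normal
    # process variation while still catching real defects.
    m, s = mean(readings), stdev(readings)
    return (m - k * s, m + k * s)

golden_3v3_rail = [3.31, 3.29, 3.32, 3.30, 3.31]  # volts, invented data
lo, hi = limits_from_golden(golden_3v3_rail)
print(f"3V3 rail limits: {lo:.3f} .. {hi:.3f} V")
```

In practice the window would also be sanity-checked against the component datasheet tolerances, so that a batch of golden boards that happens to cluster tightly does not produce unrealistically narrow limits.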
Maintenance is perhaps the most neglected part of a test system design. In any tester there will be consumable items that need to be replaced periodically. Test probes, sacrificial connectors and cables all have a finite life, and so it is advisable to replace these items on a schedule rather than waste time tracking down fixture issues as the system ages. Similarly, most analogue measuring equipment requires periodic calibration, or at least checking against a traceable instrument with a 4× greater accuracy. With this in mind, easy access to tester components for replacement and service must also be considered at the design stage.
Clearly there is a great deal to consider before embarking on a test system build; however, there is also a lot to be gained. At some point you will have to decide on the crux of the issue: the required diagnostics resolution, or how accurately you can pin-point a given fault. It is this alone that determines the extent of the hardware needed and the software programs to support it. Better resolution means more instrumentation and (usually) more test points, although the latter can be mitigated by the use of JTAG/Boundary-scan techniques on designs that support it.
Since JTAG/Boundary-scan uses built-in/designed-in pin access provided by the ICs themselves you can often reduce your hardware overhead by removing test points while increasing your test coverage and adding useful resources such as in-system programming of CPLDs, Flash Devices and embedded memories in µProcessors.
To find out more about ATE systems featuring JTAG/Boundary-scan, contact JTAG Technologies.