Top-Innovator 2026 – top100.de
    Helium Leak Test

    Definition and Basic Principle

    A helium leak test (also called a helium leak detection test or a tracer gas leak test) is a leak-tightness test in which helium is used as a tracer gas to find and quantify leaks in components, sealing points, or assemblies. The central result is the leak rate. It describes how much gas flows through a leak per unit time. In sealing technology, it is frequently expressed in mbar·L/s.
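    Because leak-rate requirements appear in both mbar·L/s and the SI unit Pa·m³/s, converting between the two is a common first step. The following sketch shows the conversion (1 mbar = 100 Pa and 1 L = 10⁻³ m³, so 1 mbar·L/s = 0.1 Pa·m³/s); the function names are illustrative:

```python
# Conversion between common leak-rate units.
# 1 mbar = 100 Pa and 1 L = 1e-3 m^3, so 1 mbar·L/s = 0.1 Pa·m^3/s.

MBARLS_PER_PAM3S = 10.0  # mbar·L/s per Pa·m^3/s

def mbarls_to_pam3s(q_mbarls: float) -> float:
    """Convert a leak rate from mbar·L/s to the SI unit Pa·m^3/s."""
    return q_mbarls / MBARLS_PER_PAM3S

def pam3s_to_mbarls(q_pam3s: float) -> float:
    """Convert a leak rate from Pa·m^3/s back to mbar·L/s."""
    return q_pam3s * MBARLS_PER_PAM3S

# Example: a requirement of 1e-6 mbar·L/s corresponds to 1e-7 Pa·m^3/s.
print(mbarls_to_pam3s(1e-6))
```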


    Why is helium used specifically? Helium is chemically inert and therefore generally does not react with materials or media. In addition, it can be detected with high selectivity, because a mass spectrometer separates it cleanly from other gases. As a result, very small leak rates can be detected, which matters for demanding sealing systems and quality-critical applications.

    Typically, a mass spectrometer leak detector (MSLD) is used. This instrument identifies helium by its mass and converts the signal into a leak rate. The basic procedure is always similar: helium is brought to or into the test object, and the instrument detects whether and how strongly helium passes through a sealing point or a defect.

    Leak Rate and Detection Limit

    The leak rate is a quantitative parameter that makes tests comparable. It allows a requirement from a specification to be checked directly, for example for a seal in a housing, in a threaded connection, or at a valve seat. The detection limit describes how small a leak rate can be while still remaining reliably detectable under real conditions. This limit depends not only on the instrument but also on the setup, the environment, and the background signal.
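    The comparison of a measured leak rate against a specified limit, with the detection limit as a floor, can be sketched as follows. This is a hypothetical helper, not part of any instrument software; all names and values are illustrative:

```python
def judge_leak_rate(measured: float, spec_limit: float,
                    detection_limit: float) -> str:
    """Compare a measured leak rate (e.g. in mbar·L/s) against a
    specification limit. Results below the detection limit cannot be
    quantified reliably and are flagged rather than silently passed."""
    if measured < detection_limit:
        return "below detection limit"
    if measured <= spec_limit:
        return "pass"
    return "fail"

# Example: spec limit 1e-8 mbar·L/s, detection limit 1e-10 mbar·L/s
print(judge_leak_rate(5e-9, 1e-8, 1e-10))   # pass
print(judge_leak_rate(5e-8, 1e-8, 1e-10))   # fail
```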

    In practice, the helium leak test is often significantly more sensitive than simple methods such as soap-water or coarse pressure tests. It is, however, more involved, because helium has to be applied in a controlled way, and influences such as ventilation, residual helium, and test duration can noticeably alter the result.

    Test Methods and Selecting the Right Approach

    Which test method fits depends primarily on whether the test object can be evacuated, which leak rate is required, and whether an integral leak rate (entire assembly) or the local leak location is to be found. In sealing technology, this selection is decisive, because sealing points often consist of several possible paths — for example flange faces, O-ring grooves, feedthroughs, or porous transitions.

    Vacuum integral test
    How it works: the measurement side is under vacuum; helium is applied externally to potential leak locations.
    Strengths: very high sensitivity, clear quantification.
    Typical limitations: requires vacuum technology; geometry and connections must fit.

    Sniffer test
    How it works: the test object is pressurized with helium overpressure and scanned externally with a probe.
    Strengths: the leak location can be localized; no vacuum is required at the test object.
    Typical limitations: lower sensitivity; sensitive to airflow.

    Bombing/accumulation
    How it works: the component is exposed to a helium atmosphere; escaping helium is then measured.
    Strengths: suitable for hermetic components and packaging.
    Typical limitations: time-dependent; interpretation requires experience.
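    The selection criteria named above (can the object be evacuated, is localization needed, how small is the required leak rate) can be sketched as a simple decision helper. The numeric threshold is an illustrative assumption, not a normative value:

```python
def suggest_method(can_evacuate: bool, need_localization: bool,
                   required_rate_mbarls: float) -> str:
    """Rough selection heuristic following the criteria in the text.
    The 1e-7 mbar·L/s threshold is an assumed, illustrative cut-off."""
    if need_localization:
        # Only the sniffer test points to the local leak location.
        return "sniffer test"
    if required_rate_mbarls < 1e-7:
        # Very small permissible leak rates need high sensitivity.
        return "vacuum integral test" if can_evacuate else "bombing/accumulation"
    # Moderate requirements without localization: sniffing is practical.
    return "sniffer test"

print(suggest_method(True, False, 1e-9))   # vacuum integral test
```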

    Vacuum Integral Test (Outside-in / Integral)

    In the vacuum integral test, the measurement side is connected to a vacuum system, or the test object is placed in a test chamber. Helium is then applied from the outside, either overall or to defined zones of the potential leak locations. The pressure differential transports helium through any leak toward the measurement instrument, where the MSLD detects it.

    This operating mode is particularly suitable when very low permissible leak rates have to be verified, for example for hermetically sealed housings, vacuum systems, or safety-critical sealing points. In sealing technology, the advantage is that an integral statement about the entire sealing function is obtained, provided the test boundary is clearly defined.

    Sniffer Test (Pressure-Based)

    In the sniffer test, the test object is pressurized with helium or a helium mixture. A sniffer probe is then guided externally along the sealing points and measures the local helium concentration in the surrounding air. As a result, it is often possible to clarify directly where helium escapes — for example at a sealing edge, at a joint, or at a feedthrough.

    Sensitivity is usually lower than with the vacuum method, because escaping helium mixes with air immediately. Air movement, exhaust airflow, and probe handling strongly influence the signal. For many applications in assembly and final testing, the method is nevertheless practical, because no vacuum connection is required and leak detection on real assemblies remains possible.

    Bombing/Accumulation (Special Case)

    In bombing, a component is placed in a helium atmosphere for a defined time so that helium can penetrate cavities or packaging. After this, the test checks whether helium escapes again. This principle is frequently used for hermetic components or encapsulated units where classic spray or sniffer methods cannot be applied directly.

    For seals, bombing is mainly relevant when the test object has to demonstrate a barrier function over a longer period and when the central question is whether helium remains trapped or not. The time component is an essential part of the evaluation here.
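    In accumulation measurements, the leak rate is typically estimated from the rise of the helium concentration in a closed volume over time. A minimal sketch of this relationship, assuming ideal mixing and no losses (function and parameter names are illustrative):

```python
def accumulation_leak_rate(delta_conc: float, volume_l: float,
                           time_s: float, p_mbar: float = 1013.0) -> float:
    """Estimate a leak rate in mbar·L/s from the rise of the helium
    concentration fraction (delta_conc) in a closed accumulation
    volume (volume_l, in liters) over a measurement time (time_s).

    Idealized model: Q = delta_conc * p * V / t, assuming ambient
    pressure, perfect mixing, and no helium losses from the volume.
    """
    return delta_conc * p_mbar * volume_l / time_s

# Example: the helium fraction rises by 1 ppm in a 2 L volume over 600 s.
q = accumulation_leak_rate(1e-6, 2.0, 600.0)
print(q)  # ≈ 3.4e-6 mbar·L/s
```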

    Interpreting Measurement Results: Leak vs. Permeation and Background Signal

    In sealing technology, misinterpretations occur particularly often, because sealing systems frequently contain elastomers (e.g., O-rings) and plastics. These materials can let helium pass not only at a geometric defect but also via diffusion. In addition, helium is present as a small fraction of ambient air, which influences the baseline level of the measurement signal.

    A reliable result therefore emerges only once it is clear whether the measurement signal corresponds to a leak path or to permeation and outgassing. Especially at very small leak rates, this distinction decides whether a sealing point counts as “leaking” or whether the system shows a measurable helium permeability for material reasons.

    Leak (Passage) Versus Permeation (Diffusion)

    A leak is a real passage, that is, an opening or defect through which gas flows. This can be a damaged sealing edge, a folded or twisted seal, an assembly error, or insufficient surface pressure. The signal often reacts relatively directly when helium reaches the leak location.

    By contrast, permeation means that helium migrates through the material, without any geometric leak being present. With elastomers and some plastics, helium can diffuse into the material and later escape again on the measurement side. This behavior often appears as a time-delayed or slowly rising signal. For the assessment, it is then decisive whether the specification requires actual leak-tightness or whether material-related gas permeability is permissible.
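    The distinction described above, a fast step response for a geometric leak versus a slow, delayed rise for permeation, can be sketched as a toy classifier. This is an illustrative heuristic of my own, not a standard evaluation method; the thresholds are assumptions:

```python
def classify_signal(times_s, signals, step_fraction=0.9, fast_window_s=5.0):
    """Toy heuristic: a geometric leak tends to produce a fast step in
    the helium signal once helium reaches the leak, while permeation
    through elastomers shows a slow, delayed rise. Classifies a recorded
    signal by how quickly it reaches step_fraction of its final value.
    Both threshold parameters are illustrative assumptions."""
    final = signals[-1]
    if final <= 0:
        return "no signal"
    for t, s in zip(times_s, signals):
        if s >= step_fraction * final:
            if t <= fast_window_s:
                return "leak-like (fast response)"
            return "permeation-like (slow rise)"
    return "permeation-like (slow rise)"

# A signal that jumps within seconds looks leak-like:
print(classify_signal([0, 1, 2, 3], [0.0, 0.95, 1.0, 1.0]))
```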

    Background Signal and Typical Practical Errors

    A stable measurement value requires a controlled background. Helium is present in the air, and in test environments the helium fraction can rise further due to previous tests. In addition, seals can store helium and release it later, which appears as outgassing and can mask small leak rates.

    Common causes of signals that are difficult to interpret include:

    • spraying helium for too long or from too close in the vacuum test, causing it to “load up” locally,
    • insufficient ventilation or unfavorable airflow during sniffing,
    • contamination from helium left over from previous tests, especially on test benches and fixtures,
    • missing waiting times, although sealing materials first absorb helium and only release it later with a delay.
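    Subtracting the background from the gross signal, and refusing to report a net leak rate that is not clearly above the background noise, can be sketched as follows. The 3-sigma criterion and all names are illustrative assumptions:

```python
import statistics

def net_leak_signal(gross: float, background_samples: list):
    """Subtract a background estimate from the gross helium signal.
    Sketch: returns None when the net signal is not clearly above the
    background noise (assumed 3-sigma criterion), instead of reporting
    a value that is indistinguishable from the background."""
    bg_mean = statistics.mean(background_samples)
    bg_noise = statistics.pstdev(background_samples)
    net = gross - bg_mean
    if net < 3 * bg_noise:
        return None  # not distinguishable from background fluctuations
    return net

# Example: gross signal 5e-9 over a background near 1e-9
print(net_leak_signal(5e-9, [1.0e-9, 1.1e-9, 0.9e-9]))
```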

    Quality Assurance: Requirements, Calibration, and Documentation

    Whether a leak rate is “good” or “bad” follows from the requirement. This usually comes from a product specification, from safety targets, or from a normative framework for tracer gas testing. In sealing technology, the permissible leak rate is often derived from the permissible emission, the function (e.g., pressure retention), and the risk associated with media release.

    For measurement results to be reliable, a defined calibration and traceable documentation are needed. Test reports should include at least the test method, the test parameters, the tracer gas used, the measurement instrument, the background signal, and the leak rate result. As a result, it is clear under which conditions the measurement was taken and how reproducible it is.

    Calibrated Leak, Drift, and Measurement Uncertainty

    A calibrated leak (also called a reference leak) is a defined, known leak rate used to verify and adjust the MSLD. This matters because measurement instruments can drift and because sensitivity can change with the setup, with contamination, or with ambient conditions. For small leak rates, a low and stable background is also decisive, since an elevated baseline degrades the effective detection limit.
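    The way a calibrated leak ties a raw instrument signal to a leak rate can be sketched as follows. These are illustrative helper functions; a real MSLD performs this internally during its calibration routine:

```python
def calibrate_sensitivity(ref_leak_rate: float, signal_with_ref: float,
                          signal_background: float) -> float:
    """Derive a sensitivity factor (leak rate per unit of net signal)
    from a calibrated reference leak of known leak rate."""
    net = signal_with_ref - signal_background
    if net <= 0:
        raise ValueError("reference signal not above background")
    return ref_leak_rate / net

def measured_leak_rate(signal: float, signal_background: float,
                       sensitivity: float) -> float:
    """Convert a net instrument signal into a leak rate using the
    sensitivity factor from calibration."""
    return max(signal - signal_background, 0.0) * sensitivity

# Example: a 1e-7 mbar·L/s reference leak raises the signal from 1.0 to 5.0
s = calibrate_sensitivity(1e-7, 5.0, 1.0)
print(measured_leak_rate(3.0, 1.0, s))  # net signal 2.0 -> 5e-8 mbar·L/s
```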

    When the required leak rate is close to the detection limit, calibration status, repeatability, and ambient conditions should be controlled particularly strictly. In critical cases, specialist consultation on the choice of test setup and limits is sensible.
