Tracking dynamic targets without being detected requires not only visual but also acoustic stealth. Whilst visual covertness has been explored to varying degrees for many years, robotic acoustic stealth remains sparsely studied. Our goal is to significantly extend both concepts by uniquely combining visual and acoustic stealth to maintain continuous line-of-sight observation of a moving natural object of interest, such as a wild animal, in outdoor environments without being detected. To achieve this, we have been developing and demonstrating solutions for real-time selection of pseudo-optimal monitoring locations in previously unmapped environments, locations that reduce the robot's visual conspicuousness and offer substantial opportunity for camouflage and observation. To reduce its acoustic conspicuousness, the robot monitors, in real time, the amplitude, spectral content and periodicity of environmental noise sources that offer a high probability of covering (masking) any ego-noise and are cyclic enough to be predictable. Examples of significant distracting sounds include machinery and vehicles in built environments, vehicles and mobile phones in urban environments, and wildlife and wind in natural environments. Additionally, the robot is capable of learning its own ego-noise in real time, allowing compensation and characterisation of various terrain types. We have demonstrated the combined acoustic and visual stealth approach for covertly tracking a moving target (ISER Publications) and more recently extended it so that the robot recognises and uses shadows as more discreet vantage points (ACRA Publication).
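To illustrate the masking idea described above, the following is a minimal sketch (not the project's actual pipeline) of how a robot might decide whether ambient noise currently masks its ego-noise. It compares coarse FFT band energies of the two signals and requires the ambient spectrum to exceed the ego-noise spectrum by a margin in every band; the band count, margin and function names are illustrative assumptions, not values from the work described here.

```python
import numpy as np

def band_energies(signal, n_bands=8):
    """Split the magnitude spectrum of a signal into coarse frequency bands
    and return the total energy in each band."""
    spectrum = np.abs(np.fft.rfft(signal))
    bands = np.array_split(spectrum, n_bands)
    return np.array([np.sum(b ** 2) for b in bands])

def is_masked(ego_noise, ambient, margin_db=6.0):
    """Return True if ambient noise exceeds the ego-noise by margin_db
    (an assumed safety margin) in every frequency band."""
    ego = band_energies(ego_noise)
    amb = band_energies(ambient)
    # Small epsilon avoids log-of-zero for silent bands.
    ratio_db = 10.0 * np.log10((amb + 1e-12) / (ego + 1e-12))
    return bool(np.all(ratio_db >= margin_db))

# Synthetic example: quiet motor-like ego-noise against loud broadband
# ambient noise (e.g. wind or machinery), one second at 16 kHz.
fs = 16000
rng = np.random.default_rng(0)
ego = 0.01 * rng.standard_normal(fs)
ambient = 1.0 * rng.standard_normal(fs)
print(is_masked(ego, ambient))   # loud ambient dominates every band
print(is_masked(ambient, ego))   # reversed roles: ego-noise is exposed
```

A real system would run this continuously on microphone input and combine it with a periodicity estimate (e.g. autocorrelation of band energies over time) so that movement can be timed to predictable loud phases of a cyclic sound source.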