Following the fail fast and TDD approach provides quick feedback to the engineer as to whether the change they have made works (or not, as the case may be). It is also very helpful to understand whether the code they have written actually builds/compiles consistently. You could of course use manual processes to achieve this, but they tend to be cumbersome, inconsistent, prone to error, slow, and not always fully repeatable.
Implementing automation will help speed things up, keep things consistent, and above all provide confidence. If you are running the same steps over and over again and getting the same results, it is a strong bet that the process works and you can trust it. It therefore follows that if you change one thing within your platform and the previously working process fails, there is a very good chance that your change has broken something.
There are plenty of tools available for building/compiling code, all of which do pretty much the same thing: ensure the code is written correctly, ensure all of the external references are available, and, if so, create a binary to run. As for the tools and technologies you can use to create automated tests, again there are many different flavors and solutions available. It is worth taking your time to investigate some of what is on offer (or ask the engineering teams what they use/would like to use). With a little bit of R&D you should be able to get something up and running quite quickly.
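As an illustration only, the following sketch shows the two checks a build step boils down to for a Python codebase; the src directory name is an assumption, and a compiled language would delegate the same checks to its own build tool (Maven, Gradle, Cargo, and so on):

```python
"""Minimal sketch of what an automated build step verifies for a Python codebase.

Assumptions: the source lives in ./src and the dependencies are already installed.
"""
import compileall
import subprocess
import sys


def build_check(source_dir: str = "src") -> bool:
    # 1. Is the code written correctly? Here: does it at least byte-compile?
    compiled_ok = compileall.compile_dir(source_dir, quiet=1)

    # 2. Are all of the external references (dependencies) available and consistent?
    deps_ok = subprocess.run([sys.executable, "-m", "pip", "check"]).returncode == 0

    return bool(compiled_ok) and deps_ok


if __name__ == "__main__":
    sys.exit(0 if build_check() else 1)
```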
Staying with automated testing, there is one thing that puts people off: the fact that it can be quite daunting. How much of the platform do you cover? How do you replicate users in the real world? Where do you start? There is no straightforward answer other than to keep it simple (KISS). Start by mapping out the primary use cases and create automated tests to cover them. You can always refine these or add more as you go along.
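A first automated test for a primary use case can be as small as the sketch below; the accounts module and its sign_up function are hypothetical stand-ins for whatever your platform actually does:

```python
"""Minimal sketch of a use-case-level automated test written with pytest.

The accounts module and sign_up function are hypothetical; substitute the
primary use cases of your own platform.
"""
import pytest

from accounts import sign_up  # hypothetical module under test


def test_new_user_can_sign_up():
    # Happy path of the primary use case: a visitor registers with a valid email.
    user = sign_up("alice@example.com")
    assert user.email == "alice@example.com"
    assert user.is_active


def test_sign_up_rejects_an_invalid_email():
    # Unhappy path of the same use case.
    with pytest.raises(ValueError):
        sign_up("not-an-email")
```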
One small snippet of advice regarding automated tests is not to overcomplicate your test data. Test data can be a bit of a thorny issue and can cause more problems than it is worth. A good rule of thumb is to have the test script create, and more importantly tear down, the data it needs during execution of the test itself. If you do not follow this, there is a high likelihood that the test data will go stale and become out of date quite quickly.
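A sketch of that create-and-tear-down pattern using a pytest fixture follows; the create_user, delete_user, and get_user helpers are hypothetical placeholders for however your platform manages its data:

```python
"""Minimal sketch of a test that creates and tears down its own data.

The create_user/delete_user/get_user helpers are hypothetical placeholders.
"""
import pytest

from accounts import create_user, delete_user, get_user  # hypothetical helpers


@pytest.fixture
def temporary_user():
    # Set up: create only the data this test needs, at the point it runs.
    user = create_user(email="test-user@example.com")
    yield user
    # Tear down: remove the data again so nothing stale is left behind.
    delete_user(user.id)


def test_profile_lookup_returns_the_registered_email(temporary_user):
    profile = get_user(temporary_user.id)
    assert profile.email == "test-user@example.com"
```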
So we now have some sort of automation to build and test our software components. What helps even more is the ability to automate the setup so that it can all run as and when we need it. That is where continuous integration solutions come into play.
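To make the idea concrete, the sketch below chains the build and test steps into one fail-fast pipeline of the kind a CI server (Jenkins, GitHub Actions, GitLab CI, and so on) would run on every change; the step commands are illustrative assumptions, and a real setup would live in your chosen CI tool's own configuration:

```python
"""Minimal sketch of the build-then-test pipeline a CI server runs on every change.

The step commands are illustrative assumptions.
"""
import subprocess
import sys

PIPELINE = [
    ["python", "-m", "compileall", "src"],  # build/compile check
    ["python", "-m", "pytest", "tests"],    # automated tests
]


def run_pipeline() -> int:
    for step in PIPELINE:
        print("Running:", " ".join(step))
        result = subprocess.run(step)
        if result.returncode != 0:
            # Fail fast: stop at the first broken step so feedback stays quick.
            return result.returncode
    return 0


if __name__ == "__main__":
    sys.exit(run_pipeline())
```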