Creating a software integration process seems quite easy at the beginning. Problems start appearing as it grows more complex and turns out to be not so bulletproof. To build a reliable build process I stick to a few rules.

1. Reliability from the bottom

To build a reliable complex system, it must have a rock-solid foundation. If the core blocks of the build process are buggy and fail from time to time, it is not possible to reliably build further components on top of them. So the foundation must be well tested and must behave predictably on any failure, which leads to the next rule. Example: an unchecked installation package can lead to weird failures during testing due to missing components of the tested application.
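The original does not show how such a package check might look. A minimal sketch in Python, assuming a zip-based installer and a hypothetical `REQUIRED_FILES` list (the file names are illustrative, not from the original):

```python
import zipfile

# Hypothetical list of components every installer build must contain.
REQUIRED_FILES = {"app.exe", "app.dll", "config.ini"}

def verify_package(package_path):
    """Fail fast if the installation package is missing components,
    instead of letting tests fail later for an obscure reason."""
    with zipfile.ZipFile(package_path) as pkg:
        present = set(pkg.namelist())
    missing = REQUIRED_FILES - present
    if missing:
        raise RuntimeError(
            f"Package {package_path} is missing: {sorted(missing)}")
```

Running this right after the packaging step turns a "weird failure during testing" into a clear error at the point where the package was produced.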
2. Assume the worst

To have a reliable system, you must assume that the worst events will occur sooner or later. My experience shows that if some less probable failure is ignored, it occurs all the sooner. So the build system must take all possible failures into account, react predictably, and provide meaningful information about what bad thing is happening. Example: fetching intermediate components from external sources like a remote FTP server can fail due to a temporary network outage. So check at the beginning of the build process whether the FTP server is accessible; that is better than failing in the middle for this reason. In such a case a notification could be sent to the buildmaster about what happened, or the process could wait some defined period of time and try again.
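The probe-then-retry-then-notify idea above can be sketched in Python. This is a minimal illustration, not the author's actual tooling; the host name, retry counts, and the `notify_buildmaster` helper are all hypothetical:

```python
import ftplib
import time

def notify_buildmaster(message):
    # Placeholder: a real build system might send e-mail or post to chat here.
    print(message)

def wait_for_ftp(host, retries=3, delay=60):
    """Probe the FTP server before the build starts.

    Retries a few times with a delay between attempts, and notifies the
    buildmaster with a meaningful message before giving up."""
    for attempt in range(1, retries + 1):
        try:
            with ftplib.FTP(host, timeout=10) as ftp:
                ftp.login()  # anonymous login, just to prove reachability
            return True
        except OSError as exc:
            print(f"Attempt {attempt}/{retries}: "
                  f"FTP server {host} unreachable ({exc})")
            if attempt < retries:
                time.sleep(delay)
    notify_buildmaster(f"Build aborted: FTP server {host} unreachable")
    return False
```

Calling `wait_for_ftp(...)` as the very first build step gives one clear failure message at the start instead of a cryptic one halfway through.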

3. Never auto-detect input sources

I do not recommend auto-detecting the list of source files for compilation. It can look nice and tempting: developers do not have to update the list of source files in a makefile. But then, when some crappy source file sneaks into the source folder, the build process tries to compile and link it. If both happen to succeed, it is very probable that the resulting software will misbehave.
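The explicit-list idea can be illustrated with a small Python check. The file names are hypothetical; the point is that `SOURCES` plays the role of an explicit `SOURCES = main.c parser.c util.c` line in a makefile, and anything a wildcard would pick up beyond that list is reported instead of silently compiled:

```python
from pathlib import Path

# Explicit list of sources to compile (hypothetical names),
# the equivalent of a hand-maintained SOURCES list in a makefile.
SOURCES = ["main.c", "parser.c", "util.c"]

def check_sources(src_dir):
    """Compare the explicit source list against the directory contents.

    Returns (missing, strays): listed files that do not exist, and *.c
    files on disk that a wildcard rule would silently pick up."""
    src = Path(src_dir)
    missing = [s for s in SOURCES if not (src / s).exists()]
    strays = sorted(p.name for p in src.glob("*.c") if p.name not in SOURCES)
    return missing, strays
```

A stray `experiment_old.c` left in the folder then shows up as a warning in the build log rather than ending up linked into the shipped binary.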