When I was involved in proving the existence of a then-unknown M$ virus, it
made sense to shut the system down and run an integrity check on the static filesystem from a known-clean boot environment. Doing this periodically will of course mean regular scheduled downtime. That may be a price which has to be paid for a more secure environment, unless those working on rootkit detection mechanisms can somehow guarantee the integrity of a check that runs from within the compromised environment. Since I don't see any such guarantee as realistic, is 15-30 minutes of downtime a day something we may need to accept for a higher-integrity environment?
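For what it's worth, the offline check I'm describing can be sketched with standard tools. The idea is: boot a known-clean rescue environment, mount the suspect filesystem read-only, and compare file hashes against a baseline manifest that was recorded while the system was still trusted and is kept on separate, trusted media. The function names and the paths in the usage comment are illustrative, not from any particular tool:

```shell
#!/bin/sh
# Sketch of an offline integrity check run from a known-clean boot
# environment against a read-only mounted suspect filesystem.
# The baseline manifest (and ideally the sha256sum binary itself)
# should live on trusted media outside the suspect system.

# Record a baseline manifest from a known-clean state.
make_baseline() {            # $1 = mount point, $2 = manifest file
    (cd "$1" && find . -type f -exec sha256sum {} +) > "$2"
}

# Verify the current state against the baseline; exits non-zero on
# any mismatched or unreadable file.
verify_baseline() {          # $1 = mount point, $2 = manifest file
    (cd "$1" && sha256sum --quiet -c "$2")
}

# Typical use from a rescue environment (hypothetical paths):
#   make_baseline   /mnt/suspect /media/usb/baseline.sha256
#   verify_baseline /mnt/suspect /media/usb/baseline.sha256
```

Note that this only detects changes to files present in the baseline; newly added files would need a separate comparison of the manifests themselves.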
Copyright © 2017, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds