Our team currently uses Selenium with C# (and NUnit) to run automated UI tests. All tests have been written by hand; no recorders have been used.
Issue: We now have a request that these tests track their own performance (including past performance) and raise a warning when a test's runtime increases by x% (5%, 10%, etc.).
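
To make the request concrete, here is a rough sketch of the kind of thing we imagine, using a Stopwatch in NUnit SetUp/TearDown. The JSON history file, the 10% threshold, and the simple-average baseline are placeholders for illustration, not a settled design:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Text.Json;
using NUnit.Framework;

[TestFixture]
public class LoginPageTests
{
    private Stopwatch _stopwatch;

    [SetUp]
    public void StartTimer() => _stopwatch = Stopwatch.StartNew();

    [TearDown]
    public void RecordAndCompareRuntime()
    {
        _stopwatch.Stop();
        var testName = TestContext.CurrentContext.Test.FullName;
        var elapsedMs = _stopwatch.Elapsed.TotalMilliseconds;

        // Load past runtimes for all tests from a simple JSON file (placeholder storage).
        var historyFile = Path.Combine(TestContext.CurrentContext.WorkDirectory, "perf-history.json");
        var history = File.Exists(historyFile)
            ? JsonSerializer.Deserialize<Dictionary<string, List<double>>>(File.ReadAllText(historyFile))
              ?? new Dictionary<string, List<double>>()
            : new Dictionary<string, List<double>>();

        // Compare the current run against the historical average for this test.
        if (history.TryGetValue(testName, out var pastRuns) && pastRuns.Count > 0)
        {
            var baselineMs = pastRuns.Average();
            if (elapsedMs > baselineMs * 1.10) // warn when more than 10% slower than baseline
            {
                TestContext.Out.WriteLine(
                    $"WARNING: {testName} took {elapsedMs:F0} ms, >10% above baseline {baselineMs:F0} ms");
            }
        }

        // Append the current run and persist the updated history.
        if (!history.ContainsKey(testName)) history[testName] = new List<double>();
        history[testName].Add(elapsedMs);
        File.WriteAllText(historyFile, JsonSerializer.Serialize(history));
    }

    [Test]
    public void Login_Succeeds_With_Valid_Credentials()
    {
        // ... existing Selenium steps ...
    }
}
```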
Question: What would be the best way to accomplish this? Should we build a tool from scratch to analyze the performance history of these UI and API tests, or are there existing tools we can leverage?
Blogs and Stack Exchange questions discussing load/performance testing usually reference three main tools for C# (NeoLoad, SilkPerformer, LoadRunner Professional). However, I'm not sure that what I'm being asked to do is performance (load) testing in the strict sense, so I'm not sure whether those tools will help achieve the overall goal. Those discussions also usually treat performance/load testing as separate from UI/API testing.
Summary: I'm looking for advice on what direction to take and/or what to read up on for this type of testing.