Seq Documentation and Support

Performance considerations in a project with a very high event volume per day

Hello, we want to use Seq in a project with a very high event volume per day. There will be, on average, ~1.7 million events per day, with an estimated ~0.3 KB of JSON data per event. The events should be accessible for at least 30 days and archived for more than 90 days (no upper time limit is currently defined).

This is an existing .NET project for a customer in the logistics sector, comprising 10+ desktop applications. We want to replace the current text-file logging to make it easier to search and analyze the logs; all log files are currently written to a Windows network share. We already use Serilog, writing through a Log4Net sink (Log4Net was the original logging technology).

We are also considering a second Seq server instance for long-term archiving via event forwarding. If that is not a recommended approach in this case, we would instead export the events for long-term storage using Seq.App.FileArchiveJson. We may be able to reduce the event volume in the future, but for now the Seq server would have to handle the amount of events stated above.

What I would like now are a few pointers regarding Seq performance in such a scenario. Is it even possible/recommended? What hardware recommendations could be given (the Seq server would be running in a VM)? And any other advice regarding Seq server performance in such a scenario would be appreciated.

Kind regards,
Sven Moelter
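For context, here is a rough back-of-envelope estimate of ingestion rate and raw storage derived from the numbers above. Note this counts raw JSON payload only; Seq's own indexing, metadata, and compression will change the actual on-disk footprint, so treat these as order-of-magnitude figures, not a capacity plan.

```python
# Back-of-envelope capacity estimate from the figures in the question.
# All overheads (indexes, metadata, compression) are deliberately ignored.
events_per_day = 1_700_000
avg_event_kb = 0.3          # ~0.3 KB of JSON per event

# Sustained ingestion rate (events are rarely uniform, so peaks will be higher)
events_per_second = events_per_day / (24 * 60 * 60)   # ~19.7 events/s average

# Raw payload volumes
daily_mb = events_per_day * avg_event_kb / 1024       # ~498 MB/day raw JSON
hot_gb_30d = daily_mb * 30 / 1024                     # ~14.6 GB for 30-day retention
archive_gb_90d = daily_mb * 90 / 1024                 # ~43.8 GB for a 90-day archive

print(f"{events_per_second:.1f} events/s average")
print(f"{daily_mb:.0f} MB/day raw, {hot_gb_30d:.1f} GB hot, {archive_gb_90d:.1f} GB archived")
```

On these assumptions the average ingestion rate is around 20 events per second and the 30-day hot window is in the tens of gigabytes, which is useful to know when sizing the VM's disk and memory.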

Posted by Sven Moelter 2 years ago