I feel like I'm asking something pretty obvious, but I sincerely can't find it in the documentation. Does Seq support automated data export to other data stores? E.g., we'd like to run some analytics over the gathered events, but we'd like to do it in our BI tool, and we'd like the data to be moved from Seq to the BI tool daily (or hourly). How do we approach this?
Posted by Sergey Rogovtsev 2 years ago
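One common approach (a sketch, not official guidance): Seq exposes an HTTP API, so a scheduled job can pull event batches from the server and load them into the BI tool. The `/api/events` path and the `count`/`filter`/`apiKey` parameter names below are assumptions based on Seq's HTTP API and should be verified against your server version; `fetch_events` is a hypothetical helper.

```python
import json
import urllib.parse
import urllib.request

def events_url(server, filter_expr=None, count=1000, api_key=None):
    """Build a URL for Seq's /api/events endpoint (parameter names assumed)."""
    params = {"count": str(count)}
    if filter_expr:
        params["filter"] = filter_expr   # Seq query-language filter expression
    if api_key:
        params["apiKey"] = api_key
    return server.rstrip("/") + "/api/events?" + urllib.parse.urlencode(params)

def fetch_events(server, **kwargs):
    """Pull one batch of events as JSON; run hourly/daily from a scheduler."""
    with urllib.request.urlopen(events_url(server, **kwargs)) as resp:
        return json.load(resp)

# Example: construct the request URL for recent errors.
url = events_url("http://localhost:5341", filter_expr="@Level = 'Error'", count=500)
```

From there the job can write the batch to whatever staging format the BI tool ingests (CSV, a database table, etc.).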
Currently, the time displayed for a log entry appears to be the time it was received. This is a problem when you have multiple systems all logging, as the logs can appear quite out of order even when the machine clocks are perfectly in sync. Is it possible to provide the time that should be used as the log time?
Posted by Shane Courtrille 2 years ago
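For reference, Seq's compact JSON ingestion format (CLEF) lets the client supply the event's own timestamp in the `@t` field, so the original event time travels with the event rather than being assigned on receipt. A minimal sketch in Python; the field names (`@t`, `@mt`, `@l`) follow the compact event format, but double-check against your Seq version:

```python
import json
from datetime import datetime, timezone

def clef_event(message_template, level="Information", timestamp=None, **props):
    """Build one compact-JSON (CLEF) event; @t carries the original event time."""
    ts = timestamp or datetime.now(timezone.utc)
    event = {
        "@t": ts.isoformat(),     # original event time, not receive time
        "@mt": message_template,  # message template
        "@l": level,
    }
    event.update(props)           # structured properties
    return json.dumps(event)

line = clef_event("Disk usage at {Percent}%",
                  timestamp=datetime(2017, 5, 1, 12, 0, tzinfo=timezone.utc),
                  Percent=91)
```

Client libraries such as Serilog populate `@t` automatically from the event's creation time, which should keep cross-machine ordering correct as long as the clocks agree.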
I just purchased an enterprise license for Seq, and I'd like to understand how the restrictions on the number of servers/instances associated with that license work. We host multiple instances of Seq on one server; in that scenario, do those instances count against our license limit, or is it just the physical server itself that counts?
Posted by Mark Perna 2 years ago
For reasons that seemed better at the time, I have some properties with a dot in the name. For example: @Properties["Request.StopWatch"] > 10000 — which works fine as a filter. However, I do not know the equivalent syntax for SQL-style queries, e.g.: select @Properties['Request.StopWatch'] from stream limit 1000
Posted by N Armer 2 years ago
I now have dozens of applications logging to Seq, and I have created a lot of signals (some for use by apps, for example e-mailing on exception, and some for filtering events quickly). It would be nice if we could have a hierarchy of signals. Nothing too complicated, just a two-level hierarchy: folders containing signals, and when we must select a signal (in an app instance config), the signal's name would be shown as "Folder - Signal". I don't think more levels are necessary. Regards, Valentin
Posted by Valentin Bornand 2 years ago
Hey Nick. It's actually a bug report ;-) I just ran the latest installer (3.2.16) on a machine without Seq, and on the last step I clicked the button to browse to Seq. The installer closed and Explorer opened, but localhost:5341 did not. It's OK though; I figured out the port number myself. Cheers, Ben
Posted by Ben Scott 2 years ago
Hey, how can I provide durable logging by specifying a JSON buffer file that's accessible by multiple threads? By default the buffer file is only accessible by one thread, but my application can log from different threads using the same logger instance. Any idea how to solve this problem? Thanks!
Posted by Stijn De Sloovere 3 years ago
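On the multi-threaded buffer-file question above: Serilog's `ILogger` is documented as safe for concurrent use, so within one process a single durable sink can be shared across threads (separate buffer files are typically only needed per *process*). The generic pattern behind such designs is to funnel all writes through one writer: producer threads enqueue events and a single background thread appends them to the file. A sketch of that pattern in Python (not Serilog code; file name and event shape are placeholders):

```python
import json
import queue
import threading

def start_writer(path, q):
    """Single writer thread: drains the queue and appends JSON lines to the buffer file."""
    def run():
        with open(path, "a", encoding="utf-8") as f:
            while True:
                event = q.get()
                if event is None:      # sentinel: shut down cleanly
                    break
                f.write(json.dumps(event) + "\n")
                f.flush()
    t = threading.Thread(target=run)
    t.start()
    return t

q = queue.Queue()
writer = start_writer("buffer.clef", q)

# Any number of producer threads can enqueue safely.
producers = [threading.Thread(target=q.put, args=({"@mt": "event", "N": i},))
             for i in range(3)]
for p in producers:
    p.start()
for p in producers:
    p.join()

q.put(None)       # all producers done: signal shutdown
writer.join()
```

Because only one thread ever touches the file, no file-level locking is needed.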
Hey, when I enable durable logging and use Serilog to log an event that exceeds the system's defined event size, the event is still written to the JSON buffer file and Serilog's SelfLog reports an error. That part is fine by me, but Serilog then continuously retries sending the event to Seq, which fails every time, and therefore keeps posting the same message ("RequestEntityTooLarge: ...") to the SelfLog. I would expect Serilog not to keep retrying the oversized event from the JSON buffer file. Is there any way to solve this? Thanks!
Posted by Stijn De Sloovere 3 years ago
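A client-side guard is one way to keep oversized events out of the buffer in the first place: measure the serialized event before buffering it, and replace anything over the server's limit with a small placeholder. (On the .NET side, `WriteTo.Seq` exposes an `eventBodyLimitBytes` setting for this purpose.) A sketch of the idea in Python against the raw/CLEF ingestion path; the 256 KB limit is an assumption about the server's configuration:

```python
import json

DEFAULT_LIMIT = 256 * 1024  # assumed server-side event size limit, in bytes

def guard_event(event, limit=DEFAULT_LIMIT):
    """Return the serialized event if it fits, else a small placeholder event,
    so the buffer never contains an event the server will reject."""
    payload = json.dumps(event)
    size = len(payload.encode("utf-8"))
    if size <= limit:
        return payload
    return json.dumps({
        "@mt": "Event dropped: serialized size {Size} exceeded limit {Limit}",
        "Size": size,
        "Limit": limit,
    })

small = guard_event({"@mt": "ok"})
big = guard_event({"@mt": "huge", "Blob": "x" * 300_000})
```

The placeholder still records *that* something was dropped, so the failure is visible in Seq rather than only in the SelfLog.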
Hello, I have a server with many applications, and I want to use Seq Forwarder on it. My applications use API keys, but the Seq server doesn't receive them, and if I put one in the Seq Forwarder configuration, all my events end up with the same key. I understand that the forwarder doesn't have the list of API keys and cannot filter requests itself, but it should at least forward each application's API key to the Seq server. The "applied properties", "filter", and "minimum level" features are great and should work even when events pass through the forwarder.
Posted by Valentin Bornand 3 years ago
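For context on the request above: clients normally attach their API key per request via the `X-Seq-ApiKey` header (or the `apiKey` setting in Serilog), and that header is what a pass-through forwarder would need to preserve. A sketch of building such an ingestion request in Python; the server URL and key are placeholders:

```python
import json
import urllib.request

def build_ingest_request(server, api_key, clef_line):
    """Build a raw-ingestion POST carrying the per-application API key."""
    req = urllib.request.Request(
        server.rstrip("/") + "/api/events/raw?clef",
        data=clef_line.encode("utf-8"),
        method="POST",
    )
    req.add_header("Content-Type", "application/vnd.serilog.clef")
    req.add_header("X-Seq-ApiKey", api_key)  # per-app key travels with the request
    return req

req = build_ingest_request("http://localhost:5341", "app-one-key",
                           json.dumps({"@mt": "Hello"}))
```

If the forwarder copied this header from each incoming request onto the outgoing one, per-application keys (and their applied properties, filters, and minimum levels) would keep working end to end.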
We have been running Seq in production for a few weeks and noticed that our retention policies never work. As a test, we set an aggressive policy to delete all events after 1 day. Disk storage continues to grow, and the Diagnostics tab tells us we have something like 2.25 days recorded (depending on when we check). Eventually we run out of disk space and get hosed. I have tried:
- a simple service restart
- a server reboot
- deleting the entire data folder (Extents, AppData, Packages, etc.) and restarting the service
None of these has worked. We are running on Windows Server 2012 R2 and the service runs as "Local System". Am I missing something? Is there anything I can do or run to help figure this out? Thank you!
Posted by Phil Johnson 3 years ago