Is it possible to filter logs by relative date, e.g. logs created more than an hour ago, or all logs created yesterday? The documentation only lists date filtering with explicitly created date values, e.g. @Timestamp < DateTime("2014-01-03"). I would like to create a filter with a dynamic date, e.g. @Timestamp < DateTime.Now.AddDays(-1).
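One workaround, if only literal dates are accepted by the filter syntax, is to compute the cutoff dynamically in code and format it into the documented DateTime("…") form before submitting the filter. A minimal sketch (the timestamp format is an assumption; check how your Seq version parses date literals):

```csharp
// Sketch: build the "yesterday" cutoff at query time, then emit it in
// the literal DateTime("...") syntax the documentation shows.
var cutoff = DateTime.UtcNow.AddDays(-1);
var filter = $"@Timestamp < DateTime(\"{cutoff:yyyy-MM-ddTHH:mm:ss}Z\")";
// `filter` can then be pasted into the filter bar, or passed to the
// HTTP API's filter parameter.
```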
Posted by Brent 3 years ago
Hi, I'm using Seq 2.4.2 as a single user and it's very good. I have noticed that memory usage is always quite high. On average it's around 2.3GB, a lot of the time it's been at 5GB, and for a little while it was around 11GB. Is there a way of restricting memory usage? Looking at the stats:

Rate accepted/arrived (last minute): 701/701
Working set (bytes): 2153771008
Threads: 54
User thread pool threads available: 32764
IOCP thread pool threads available: 998
Memory utilization: 44%
Free space (bytes): 250435215360

cheers Greg O
Posted by Greg O 3 years ago
We have 3 different Azure cloud services, each with 2 VMs (so 6 servers total), that we are using with durable log shipping to log to a Seq server hosted on an Azure VM. As of 1 day ago, some of the VMs within these cloud services just stopped logging. Some other VMs continued to work just fine. If I reboot a VM that stopped logging, it will log fine for a few minutes and then stop again. In addition, once the logging stops, there seems to be a slow consumption of memory until the box uses all its RAM and starts to become unstable, at which point we have to reboot it and restart the cycle.

The first thing we've done is upgrade everything to the latest: client side is on Serilog 1.5.14, all of the corresponding "Extras" and "Sinks" are at the most recent versions compatible with 1.5.x, and our Seq server is 2.4.2. We are monitoring to see if this change has any effect on the memory issue. So far, even after the upgrade and reducing the batchPostingLimit to 10, we are still seeing our logging stop after a while (it seems to take longer than a few minutes now, but eventually it still stops, at least on some servers).

I enabled the SelfLog and the error we are seeing is this:

[code] 2016-01-26T02:59:41 Exception while emitting periodic batch from Serilog.Sinks.Seq.HttpLogShipper: System.AggregateException: One or more errors occurred. ---> System.Threading.Tasks.TaskCanceledException: A task was canceled. --- End of inner exception stack trace --- at System.Threading.Tasks.Task`1.GetResultCore(Boolean waitCompletionNotification) at Serilog.Sinks.Seq.HttpLogShipper.OnTick() ---> (Inner Exception #0) System.Threading.Tasks.TaskCanceledException: A task was canceled.<--- [/code]

The way I read that error is that the server is too busy to handle the request? But that is an unvalidated guess, because there is nothing to indicate any sort of problem on the server: nothing notable in the Seq server logs, and certainly nothing to indicate that it is failing to handle requests. Also, once this error starts happening it just keeps happening, and the server never starts logging again until it is rebooted or the IIS app is recycled. So that is not very "durable" at all! I'm not quite sure how to figure out what our problem is at this point. It feels like maybe our only option is to stop using durable log shipping, but if we've got a server issue, is that going to help? We need some guidance on how to troubleshoot and understand what the real problem is so we can address it correctly. Thanks
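For what it's worth, a TaskCanceledException surfacing from HttpClient usually indicates the request hit the client-side timeout (HttpClient.Timeout, 100 seconds by default) rather than the server returning an error, which would be consistent with the Seq logs showing nothing. A standalone probe from an affected VM could confirm whether posts are genuinely slow. This is a sketch with a placeholder server URL, not the sink's actual code:

```csharp
// Sketch: time a small POST to Seq's raw ingestion endpoint from an
// affected VM. A hang or timeout here points at the network path or
// the server; a fast 2xx points back at the client-side shipper.
// Requires: using System.Diagnostics; using System.Net.Http; using System.Text;
using (var client = new HttpClient { Timeout = TimeSpan.FromSeconds(30) })
{
    var sw = Stopwatch.StartNew();
    var body = new StringContent("{\"Events\":[]}", Encoding.UTF8, "application/json");
    var response = client.PostAsync("http://my-seq-server/api/events/raw", body).Result;
    Console.WriteLine($"{(int)response.StatusCode} after {sw.ElapsedMilliseconds} ms");
}
```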
Posted by James 3 years ago
Hi, I've been integrating SEQ into one of our existing projects. I've tested this and found that all of the SEQ functionality works as expected on my local build. However, when I deploy the latest version to our live server, I have found that nothing is being reported (including logging on application_start). I've checked the live version and everything seems to match my development build (DLLs are present, web.config is set to send SEQ requests to our SEQ server). Are there any common things that can prevent a project from posting to a SEQ server (i.e. a missing file/reference, etc.)? Thanks.
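One generic first step is to turn on Serilog's SelfLog on the live server, since sink failures (DNS, firewall, proxy, bad API key) are otherwise swallowed silently. A sketch; the exact API depends on the Serilog version in use:

```csharp
// Sketch: route the sink's internal errors somewhere visible.
// Serilog 1.5.x exposes a TextWriter property; 2.x uses Enable().
Serilog.Debugging.SelfLog.Out = Console.Error;        // Serilog 1.5.x
// Serilog.Debugging.SelfLog.Enable(Console.Error);   // Serilog 2.x
```

In a web app, point it at a file or trace writer instead of Console.Error so the output is actually captured.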
Posted by Adam M 3 years ago
Greetings, I am somewhat new to Azure, but have a pretty good grip on most of the setup required for a new virtual machine. I have set up a VM in Azure (Resource Manager; new VM; not classic), added the proper Inbound Security Rules for the port, opened the port within Windows on the VM, and installed Seq. So far, I can only get Seq to properly bind and load using localhost:5341. I do have a DNS name set up for my Azure Seq VM - let's say it's my-seq-log.eastus2.cloudapp.azure.com. When installing Seq to listen at this URL, the Seq site never loads, but the service will start. I followed Nick's docs exactly. I am not sure what else to do here to get Seq running/listening on the Azure DNS name for my VM. I cannot even access it via the public IP + port. Any troubleshooting help is appreciated.
Posted by Jesse Beard 3 years ago
Hi, I just started using Serilog alongside Seq as my logging platform, and I'm amazed! Congratulations! I've set up Serilog in an MVC5 app using Autofac injection, logging to Seq with durable log shipping. Everything works fine except when I try to log exceptions. No exception log shows up in Seq, but it is created in the file. When Serilog debugging is enabled, I get this error message: [code] 2016-01-09T00:48:31 Received failed HTTP shipping result RequestEntityTooLarge: Maximum raw payload content length of 1048576 bytes exceeded [/code] Did I overlook something in configuration? Is this configurable? Thanks! João
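The server is rejecting whole batches whose raw payload exceeds 1048576 bytes, and a single large exception event can do that on its own. On the client side, recent versions of Serilog.Sinks.Seq expose an event-size cap; a sketch (parameter names may differ across sink versions — check the WriteTo.Seq overloads you have installed, and the buffer path is a placeholder):

```csharp
// Sketch: cap individual event size so one oversized exception event
// can't push a durable batch past the server's 1 MB raw payload limit.
// Oversized events are dropped (and noted in SelfLog) rather than shipped.
Log.Logger = new LoggerConfiguration()
    .WriteTo.Seq("http://my-seq:5341",
        bufferBaseFilename: @"C:\Logs\myapp",   // durable shipping, as before
        eventBodyLimitBytes: 262144)            // drop events over 256 KB
    .CreateLogger();
```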
Posted by João Pereira 3 years ago
I was thinking the following would only return events that match the signal: https://logging/api/events?intersect=signal-193&apiKey=1234 It appears to return everything. Is there a way to accomplish this?
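If the signal-id parameter turns out not to be honored by the events endpoint, one fallback that stays within the documented surface is to pass the signal's underlying filter expression via the filter parameter instead. A sketch (the filter text here is a stand-in for whatever signal-193 actually contains):

```csharp
// Sketch: replicate the signal by sending its filter expression
// explicitly. "Level = 'Error'" is a placeholder for the real filter.
var filter = Uri.EscapeDataString("Level = 'Error'");
var url = $"https://logging/api/events?filter={filter}&apiKey=1234";
// GET this URL as before; only events matching the filter should return.
```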
Posted by Daniel Powell 3 years ago
I'm running Seq on an Azure instance (Windows Server 2012 R2 Datacenter) and logging with Serilog from a console application running on my local workstation. I have 3 sinks configured - File, Console and Seq. I'm also running on dnxcore50 (just in case you were thinking my setup wasn't dodgy enough). All my events are showing up in the console and the file 100% of the time. Seq is only capturing events about 1 in 5 runs; that is, it will either capture all the events for a run or none of them. I've set up SelfLog.Out on the loggers, but that logs nothing at all. Has anyone seen anything similar? Is this something I should expect pushing log events across the public web (i.e. would it work reliably if I hosted Seq on my local network)? Or is this probably something to do with Serilog, and I should be asking over there? Thanks.
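A common cause of this "all or nothing" pattern in short-lived console apps is the process exiting before the Seq sink's periodic batch is posted; the file and console sinks write synchronously, so they never miss. A sketch of flushing before exit (the server URL is a placeholder; CloseAndFlush is in the Serilog 2.x API — on older builds, dispose the logger instead):

```csharp
// Sketch: make sure buffered batches are sent before the process ends.
Log.Logger = new LoggerConfiguration()
    .WriteTo.Console()
    .WriteTo.Seq("http://my-azure-seq:5341")   // placeholder URL
    .CreateLogger();
try
{
    Log.Information("Run starting");
    // ... application work ...
}
finally
{
    Log.CloseAndFlush();   // or: (Log.Logger as IDisposable)?.Dispose();
}
```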
Posted by [email protected] 3 years ago
I understand .Net 5 has just hit RC1, but are there any plans for a cross-platform version of Seq? I've been starting to work with the 2.0 branch of Serilog and sometimes I like to work on my Mac. It would be nice to not have to run an Azure instance and be able to log to localhost for development.
Posted by [email protected] 3 years ago
Hi! We are using Serilog with the Seq sink, latest versions from NuGet. The sink is configured to use a buffer file, and the batch limit is 10. For the most part this has worked OK, but for a couple of apps I see that sending events to Seq is not happening. For the apps that work we usually see a .bookmark file and 1-2 buffer .json files. For the misbehaving apps, though, there are around two dozen buffer files now, and the bookmark file is stuck at 10/23 (today is 1/12 already). I do not see any invalid payloads coming in, and I have tried recycling the application pools to no avail. Thankfully, only the test environment is affected so far. Is there something I can do to fix this problem? Thanks
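If the bookmark really is wedged, one blunt recovery (at the cost of the buffered events) is to stop the app, delete the .bookmark file and the stale buffer .json files, and restart so the shipper begins from a clean slate. A sketch with placeholder directory and file prefix:

```csharp
// Sketch: with the application stopped, clear the stale shipper state.
// Paths/prefix are placeholders matching bufferBaseFilename; events
// still sitting in the deleted buffers will not be shipped.
// Requires: using System.IO; using System.Linq;
var bufferDir = @"C:\Logs\Buffer";
var stale = Directory.EnumerateFiles(bufferDir, "myapp*.bookmark")
    .Concat(Directory.EnumerateFiles(bufferDir, "myapp-*.json"));
foreach (var file in stale)
{
    File.Delete(file);
}
```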
Posted by Vitalii Biliienko 3 years ago