Datasource to monitor Windows Services/Processes automatically?
Hello,

We recently cloned two LogicMonitor out-of-the-box DataSources (WinService- and WinProcessStats-) in order to enable the Active Discovery feature on them. We did this because we need to discover services/processes automatically: we don't have an exact list of which services/processes to monitor, given the number of clients (100+) and the different services/solutions running across them.

After enabling this, it works fine and does what we expect (it discovers all the services/processes running on each box). For the services we also added some Active Discovery filters to exclude common "noisy" services and keep only the ones set to start automatically with the system.

Our problem is that these two DataSources start to impact collector performance because of the huge number of WMI queries. CPU usage sits at almost 100% all the time, which in turn degrades collector performance and data collection, resulting in request timeouts and full WMI queues.

We also thought about creating two DataSources (services/processes) per client, with filters to grab only the critical/wanted processes and services for that client, but that's a nightmare, especially when clients install applications without any notice and expect us to pick them up and monitor them automatically.

Example from one of our clients:
- The collector is a Windows VM (VMware) with 8 GB of RAM and 4 allocated virtual processors (the host processor is an Intel Xeon E5-2698 v3 @ 2.30 GHz).
- It currently monitors 78 Windows servers (not counting the collector itself), and those two DataSources create 12,700 instances (4,513 services and 8,187 processes), which works out to roughly 15 requests per second from one DataSource and 45 from the other.

According to the collector capacity document (for the Medium collector size) we are below the WMI limits; however, those two DataSources contribute a lot to filling the queues, and we are seeing these errors on a regular basis.

To sum up, we are looking for another way of doing the same thing without consuming so many resources on the collector (due to the volume of simultaneous WMI queries). Not sure whether that's possible, though. Has anyone had this need in the past and come up with a different, less resource-intensive solution? We're struggling here mainly because we come from an agent-based solution, which didn't face this problem because the load was distributed across the individual agents on each device.

Appreciate the help in advance! Thanks,
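One approach that can cut the WMI load considerably is to collect everything for a host in a single batched query per poll, instead of one WMI request per service instance: a scripted (BATCHSCRIPT-style) DataSource runs one Win32_Service query per device and prints values for every instance at once. The sketch below is a rough Groovy illustration of that idea, not a drop-in replacement for the cloned WinService- DataSource; it assumes the collector's com.santaba.agent.groovyapi.win32.WMI helper with a queryAll(host, query, timeout) signature, and that the discovered wildvalue is the service name.

import com.santaba.agent.groovyapi.win32.WMI

def host = hostProps.get("system.hostname")

// One WMI query per device per poll, rather than one request per service instance.
// Assumption: queryAll returns a List of Maps, one map per row.
def rows = WMI.queryAll(host, "SELECT Name, State, StartMode FROM Win32_Service WHERE StartMode = 'Auto'", 30)

rows.each { row ->
    // Normalise key casing, since column names may come back upper-cased.
    def r = row.collectEntries { k, v -> [(k.toString().toLowerCase()): v] }
    def wildvalue = r.name
    // BATCHSCRIPT output format: wildvalue.datapoint=value
    println "${wildvalue}.Running=${r.state == 'Running' ? 1 : 0}"
    println "${wildvalue}.AutoStart=${r.startmode == 'Auto' ? 1 : 0}"
}
return 0

The trade-off is that discovery and collection both go through the script, so the noisy-service filtering has to live in the WQL WHERE clause or in the script itself rather than in the Active Discovery filter UI.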

Aruba Central Monitoring?

If you are responsible for monitoring Aruba Wireless Access Points, CX Switches, and/or EdgeConnect SD-WAN and might be interested in an official Aruba Central integration, please consider completing this short questionnaire. It is completely voluntary and confidential, consists of four multiple-choice questions, and should take less than one minute to complete. Thanks for your consideration. https://docs.google.com/forms/d/e/1FAIpQLSd3O8AIMj_aXA24Pcc9q_ZBGDOUmrKFe4_d1aedyEfjVBkn2w/viewform?usp=sf_link

Is there a way to export a data source to a template file; CSV?

So we are in the final phases of rolling out LogicMonitor, and now the daunting process of alert tuning is upon us. In our old monitoring solution everything was heavily tweaked and customized, and overall we were at roughly 400-600 alerts. In LogicMonitor we are currently at 13,000+. We need to seriously tune the DataSources, and we need a way to show our SMEs what each DataSource monitors, what it alerts on, what the thresholds are, and so on. Is there a way to export a DataSource's monitoring template to a CSV file, so we can reference it and our SMEs can then tell us what to turn off, adjust, etc.? I see there is an "Alert Threshold Report" in the Reports section, but that lists every single datapoint instance on a group/resource, and we don't want that. We need what the base DS template looks at, uses, and applies to each matching resource.
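There isn't a built-in "export template to CSV" option, but the REST API exposes DataSource definitions, including each datapoint's description and alert expression, so a short script can flatten them into a CSV for the SMEs to review. Below is a rough Groovy sketch assuming a Bearer-type API token and the /santaba/rest/setting/datasources endpoint; the portal name and token are placeholders, and the field names (items, dataPoints, alertExpr) are from memory, so verify them against the current API documentation before relying on the output.

import groovy.json.JsonSlurper

// Placeholders - substitute your own portal name and API token.
def portal = "yourportal"
def token  = "lmb_xxxxxxxxxxxx"
def url    = "https://${portal}.logicmonitor.com/santaba/rest/setting/datasources?size=1000"

def conn = new URL(url).openConnection()
conn.setRequestProperty("Authorization", "Bearer ${token}")
conn.setRequestProperty("X-Version", "3")

// Assumption: a v3-style response with a top-level "items" list.
def result = new JsonSlurper().parseText(conn.inputStream.text)

def csv = new File("datasource_templates.csv")
csv.text = "datasource,applies_to,datapoint,description,alert_expression\n"
result.items.each { ds ->
    ds.dataPoints?.each { dp ->
        // Quote every field so commas inside descriptions or alert expressions don't break the CSV.
        def row = [ds.name, ds.appliesTo, dp.name, dp.description, dp.alertExpr]
                .collect { '"' + (it ?: "").toString().replace('"', '""') + '"' }
        csv << row.join(",") + "\n"
    }
}
println "Wrote ${csv.absolutePath}"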

Windows Services Monitoring with quite a bit more Automation applied

So today we use LM's Microsoft Windows Services DataSource to monitor Windows services. This DS uses a Groovy script and WMI calls under the hood to fetch service metrics like state, start mode, status, etc. Everything works fine, but one of the prerequisites is to manually populate the list of Windows services, which the DS then parses out as the WILDVALUE variable in the script. You know: go to the device, click the down arrow (Manage Resource Options) --> Add Additional Monitoring --> and choose from the list of Windows services. Rinse, repeat, and save. Then the DS goes to work. Well, what if you have a list of over 100 Windows services that you need to add to, say, 20 Windows devices? Populating that list manually would take forever... That's problem number one. Scratch that - it isn't really a problem, since you can run a PowerShell (or Groovy) script to do this using undocumented but perfectly workable LM API calls. That problem is solved. Next: this list of over 100 services needs to be *refreshed* every 24 hours or so, to remove nonexistent services and add new ones based on a regex filter. That's problem number two. Again, it can be done programmatically with API calls, but here is where I'm trying to figure out the right approach. Run my script as a custom PropertySource? I'm not really writing resource properties - I'm updating the instance list (Windows services) under Additional Monitoring on a bunch of resources. Plus, PropertySources are applied when Active Discovery runs, which is what, every 24 hours? Or should I write a custom DataSource that performs this refresh and give it a one-day collection interval? Thanks.
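An alternative worth considering, instead of pushing the instance list through the API at all, is to let scripted Active Discovery maintain it: a cloned services DataSource can query Win32_Service itself, apply the regex filter inside the discovery script, and emit only the matching services as instances, so the normal discovery schedule handles additions and removals without any external job. A minimal Groovy discovery sketch along those lines follows; the wmi.service.filter property is a hypothetical per-client regex, and the WMI helper signature is assumed.

import com.santaba.agent.groovyapi.win32.WMI

def host   = hostProps.get("system.hostname")
// Hypothetical device/group property holding the regex of service names to keep.
def filter = hostProps.get("wmi.service.filter") ?: ".*"

// Assumption: queryAll(host, query, timeoutSecs) returns a List of Maps, one per service.
def rows = WMI.queryAll(host, "SELECT Name, DisplayName, StartMode FROM Win32_Service", 30)

rows.each { row ->
    def r = row.collectEntries { k, v -> [(k.toString().toLowerCase()): v] }
    // Keep only auto-start services whose name matches the per-client regex.
    if (r.startmode == "Auto" && r.name ==~ filter) {
        // Standard Active Discovery output: wildvalue##wildalias##description
        println "${r.name}##${r.displayname}##${r.startmode}"
    }
}
return 0

With the DataSource set to automatically delete instances and the discovery schedule set to daily, this covers both the add and the remove side of the refresh.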

JSON Path capability in a Webpage DataSource

I think the answer to this is going to be "You need to script it, dummy", but I figured I'd check anyway... I'm working on a new DataSource that pulls and interprets JSON data from a pseudo-custom system via HTTP. The system has a status page that lists the status of various components using JSON elements with this general format: ParameterName&ParameterType. My initial idea was to use the Webpage collector, since it supports JSON/BSON parsing. The issue I'm running into is that the values of most of these JSON elements are strings (i.e. "true"/"false"). I set up a datapoint that extracts the value with a JSON Path like this:

$.[##WILDVALUE##].[ParameterName&ParameterType]

...and I can see that I'm getting the "true"/"false" values back when I do a Poll Now. But, as we know, LM won't deal with strings natively. The workaround I came up with was to take the length of the string, since "true" and "false" have different lengths. According to sources online, JSON Path should support a string-length calculation, and I've verified this by pasting my data and my JSON Path expression:

$.[ComponentName].[ParameterName&ParameterType].length

...into https://jsonpath.com/ . In that parser, the .length function works as expected and returns the length of the JSON value. In LogicMonitor, however, I just get this (example failure):

NAN after processing with json (postProcessParam: $.[ComponentName].[ParameterName&ParameterType].length, reference=, useValue=body)

Does anyone know if there is a way to make this JSON Path length function work?
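If the Webpage collector's JSON post-processor won't evaluate .length, the usual fallback is exactly the "script it" route: a scripted collection fetches the status page itself, parses it with JsonSlurper, and converts the string values to numbers before printing them, so the datapoint layer never sees a string. A rough sketch of that approach is below; the /status path, the HTTP port, and the key names are stand-ins based on the description in the post, and the datapoint names are sanitised because characters like "&" are not valid in them.

import groovy.json.JsonSlurper

def host      = hostProps.get("system.hostname")
// Assumption: for SCRIPT collection the instance wildvalue (the component name)
// is available via instanceProps.
def wildvalue = instanceProps.get("wildvalue")

// Assumption: the status page is served over HTTP at /status.
def json = new JsonSlurper().parseText(new URL("http://${host}/status").text)

def component = json[wildvalue]
component?.each { key, value ->
    // Convert the string booleans into numbers LogicMonitor can graph and alert on.
    def numeric = value.toString().equalsIgnoreCase("true") ? 1 : 0
    // key=value output; the datapoint names must match the sanitised keys.
    println "${key.replaceAll(/\W/, '_')}=${numeric}"
}
return 0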

GroovyScriptHelper issues - method missing

I'm trying to build a DataSource for our health-check API that returns a system name, status, and response time. I believe I need to set up Active Discovery to map the system names as the instance keys, then query the API again to collect the data. The issue I'm running into is that the GSH methods intended to handle the instancing won't load, giving me the error below. The script itself runs successfully; it's just the instancing bit that's giving me trouble.

MissingMethodException: No signature of method: Script1.withBinding() is applicable for argument types: (groovy.lang.Binding) values: [groovy.lang.Binding@781fc692]
Possible solutions: setBinding(groovy.lang.Binding), getBinding()
com.logicmonitor.common.sse.utils.exception.ScriptExecutingFailedException: MissingMethodException: No signature of method: Script1.withBinding() is applicable for argument types: (groovy.lang.Binding) values: [groovy.lang.Binding@781fc692]
Possible solutions: setBinding(groovy.lang.Binding), getBinding()
    at com.logicmonitor.common.sse.utils.GroovyScriptHelper.execute(GroovyScriptHelper.java:197)
    at jdk.internal.reflect.GeneratedMethodAccessor17.invoke(Unknown Source)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at com.logicmonitor.common.sse.executor.impl.GroovyScriptHelperWrapper.execute(GroovyScriptHelperWrapper.java:86)
    at com.logicmonitor.common.sse.executor.GroovyScriptExecutor.execute(GroovyScriptExecutor.java:75)
    at com.logicmonitor.common.sse.SSEScriptExecutor$ScriptExecutingTask.call(SSEScriptExecutor.java:212)
    at com.logicmonitor.common.sse.SSEScriptExecutor$ScriptExecutingTask.call(SSEScriptExecutor.java:155)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
elapsed time: 0 seconds

The code block I'm using for discovery:

import com.santaba.agent.groovyapi.http.*;
import groovy.json.JsonBuilder;
import groovy.json.JsonSlurper;
import com.logicmonitor.common.sse.utils.GroovyScriptHelper as GSH
import com.logicmonitor.mod.Snippets

def modLoader = GSH.getInstance()._getScript("Snippets", Snippets.getLoader()).withBinding(getBinding())
def lmEmit = modLoader.load("lm.emit", "0") // Load LM emit module

def api_key = hostProps.get("boomi.client_id");
def api_secret = hostProps.get("boomi.client_secret");
def api_scope = hostProps.get("boomi.api_scope")

if (!api_key && !api_secret) {
    println "boomi.client_id and boomi.client_secret must be defined.";
    return 1;
}

// used to get an OAuth2 token
def api_auth_host = "login.microsoftonline.com";
def api_tenant_id = hostProps.get("boomi.azure_tenant_id");
def api_auth_url = "https://login.microsoftonline.com/${api_tenant_id}/oauth2/v2.0/token"

// define API path to get info from
def api_base_host = hostProps.get("system.hostname");
def api_base_url = "https://${api_base_host}/healthcheck/v2";
def api_x_apikey = hostProps.get("boomi.apixkey");

// get OAuth2 token
def auth_token;
try {
    def conn = HTTP.open(api_auth_host, 443);
    def headers = ["Content-Type": "application/x-www-form-urlencoded"];
    def postdata = "grant_type=client_credentials&client_id=${api_key}&client_secret=${api_secret}&scope=${api_scope}"
    def response = HTTPPost(api_auth_host, api_auth_url, postdata, headers);
    auth_token = ParseJsonResponse(response);
}
catch (Exception e) {
    println "OAuth2 Exception: " + e.message;
    return 1;
}

// get data from API using token
def systems;
try {
    def conn = HTTP.open(api_auth_host, 443);
    def headers = [
        "Accept"       : "application/json",
        "Authorization": "${auth_token.token_type} ${auth_token.access_token}",
        "X-APIKey"     : api_x_apikey
    ];
    def response = HTTPGet(api_base_host, api_base_url + "/oncloud", headers);
    systems = ParseJsonResponse(response);
}
catch (Exception e) {
    println "API Exception from oncloud: " + e.message;
    return 1;
}

systems?.each { val ->
    def wildvalue = val.systemName
    def wildalias = wildvalue
    def description = val.systemName
    lmEmit.instance(wildvalue, wildalias, description, [:])
}

try {
    def conn = HTTP.open(api_auth_host, 443);
    def headers = [
        "Accept"       : "application/json",
        "Authorization": "${auth_token.token_type} ${auth_token.access_token}",
        "X-APIKey"     : api_x_apikey
    ];
    def response = HTTPGet(api_base_host, api_base_url + "/onpremise", headers);
    systems = ParseJsonResponse(response);
}
catch (Exception e) {
    println "API Exception from onpremise: " + e.message;
    return 1;
}

systems?.each { val ->
    def wildvalue = val.systemName
    def wildalias = wildvalue
    def description = val.systemName
    lmEmit.instance(wildvalue, wildalias, description, [:])
}

return 0;

// Functions

// Convenience method for making HTTP POST requests to the REST API
def HTTPPost(host, url, post_data, request_headers, user = null, pass = null) {
    // Create an http client object
    def conn = HTTP.open(host, 443)
    // If we have credentials, add them to the headers
    if (user && pass) {
        conn.setAuthentication(user, pass)
    }
    // Query the endpoint
    def response = conn.post(url, post_data, request_headers)
    def response_code = conn.getStatusCode()
    def response_body = conn.getResponseBody()
    // Close the connection
    conn.close();
    // Hand back the status code and body expected by ParseJsonResponse
    return [code: response_code, body: response_body];
}

// Convenience method for making HTTP GET requests to the REST API
def HTTPGet(host, url, request_headers, user = null, pass = null) {
    // Create an http client object
    def conn = HTTP.open(host, 443)
    // If we have credentials, add them to the headers
    if (user && pass) {
        conn.setAuthentication(user, pass)
    }
    // Query the endpoint
    def response = conn.get(url, request_headers)
    def response_code = conn.getStatusCode()
    def response_body = conn.getResponseBody()
    // Close the connection
    conn.close();
    // Hand back the status code and body expected by ParseJsonResponse
    return [code: response_code, body: response_body];
}

// Convenience method for parsing REST responses.
def ParseJsonResponse(response) {
    // Successful response?
    if (response.code =~ /^2\d\d/) {
        // Yes, parse out a JSON object
        return new JsonSlurper()?.parseText(response.body);
    }
    else {
        // No, error out
        throw new Exception("There was an error fetching from the API: ${response.code}");
    }
}
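For what it's worth, nothing in this discovery script strictly requires the lm.emit snippet: if the GroovyScriptHelper/withBinding call keeps failing on your collector version, a simple fallback is to skip the snippet loader and print the instance lines yourself in the standard wildvalue##wildalias##description Active Discovery format, which is essentially what lmEmit.instance() emits when no instance-level properties are passed. A sketch of that fallback for the loops above:

// Fallback: emit the Active Discovery lines directly instead of loading lm.emit.
// (Assumes no instance-level properties are needed and that the system names
//  contain no '#' characters.)
systems?.each { val ->
    def wildvalue   = val.systemName
    def wildalias   = wildvalue
    def description = val.systemName
    println "${wildvalue}##${wildalias}##${description}"
}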

Testing groovy script on local ubuntu machine

I am searching for a way to run Groovy scripts on my local Ubuntu (WSL) machine while still being able to work with the same classes the collectors use. To achieve this I installed Java and Groovy and deployed a collector. However, I am unable to import classes like com.santaba.agent.groovyapi.snmp.Snmp. Is there any way to achieve this? Or can I somehow access the collector shell locally to test Groovy code without needing the LM web UI? Basically, I am looking for a way to develop scripted DataSources from the CLI, without fiddling with the web UI. Any guidance is very much appreciated.
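One partial workaround, assuming you can reach a machine where a collector is installed, is to point your local Groovy runtime at the collector's lib directory so that the com.santaba.agent.groovyapi classes at least resolve. The sketch below is a rough attempt at that: the install path is an assumption, the rootLoader trick only works when the script is launched with the groovy command, and some groovyapi classes expect a running collector (credentials, configuration, task context), so this is useful for exploring the API and checking syntax rather than for faithful end-to-end testing. The other common route is the Collector Debug Console's !groovy command, but that still goes through the portal UI, which is what you are trying to avoid.

// Rough sketch: add the collector's jars to the classpath at runtime so the
// com.santaba.agent.groovyapi classes can be explored from a local `groovy` run.
// Assumption: the collector is installed under /usr/local/logicmonitor/agent.
def libDir = new File("/usr/local/logicmonitor/agent/lib")

libDir.listFiles().findAll { it.name.endsWith(".jar") }.each { jar ->
    // rootLoader is only populated when the script is started via the `groovy` launcher.
    this.class.classLoader.rootLoader.addURL(jar.toURI().toURL())
}

// Imports are resolved at compile time, so load the class reflectively instead.
def snmpClass = Class.forName("com.santaba.agent.groovyapi.snmp.Snmp")
println snmpClass.methods*.name.unique().sort()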