Extract Instance Level Alert Expression
Hello, I am making a massive number of API calls due to the high number of devices and instances in LogicMonitor, in order to extract the instance-level alert expression for each device (VM type). I have all global alerts, but there are many devices where we have instance-level alerting. Currently I am using the following API call to extract instance-level alert expressions:

/device/devices/device_id/devicedatasources/devicedatasources_id/instances/instance_id/alertsettings?size=1000&filter=alertExpr!:""

I have over 26,000 devices, and iterating over all devices, then their data sources, and eventually their instances takes millions of API calls. Is there any other way we can pull this information? I would need this data every month. I do not think there is anything like a bulk API endpoint where I can pull larger datasets in a single call, nor are there webhooks or alerting data streams that would push updates about instance-level alert configuration instead of requiring me to pull them. I cannot query just device groups, as instance-level alert expressions are found only at the device level. Has anyone had this requirement before? Any solution or alternative? Thanks, Ashish
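I'm not aware of a bulk endpoint for instance-level alert settings either, so this doesn't reduce the call count, but you can cut the wall-clock time of the monthly pull considerably by issuing the per-instance calls concurrently and keeping the alertExpr filter you already use. Below is a minimal sketch of that idea; the credentials, the ID tuples, and the worker count are placeholders, and the LMv1 signing simply mirrors LogicMonitor's published Python examples rather than anything specific to alert settings.

import hashlib, hmac, base64, time, json
import requests
from concurrent.futures import ThreadPoolExecutor

AccessId = 'REPLACE_ME'
AccessKey = 'REPLACE_ME'
Company = 'REPLACE_ME'

def lm_get(resource_path, query):
    # LMv1 request signing, same approach as LogicMonitor's example Python scripts...
    epoch = str(int(time.time() * 1000))
    request_vars = 'GET' + epoch + resource_path
    digest = hmac.new(AccessKey.encode(), msg=request_vars.encode(),
                      digestmod=hashlib.sha256).hexdigest()
    auth = 'LMv1 %s:%s:%s' % (AccessId, base64.b64encode(digest.encode()).decode(), epoch)
    url = 'https://%s.logicmonitor.com/santaba/rest%s%s' % (Company, resource_path, query)
    return json.loads(requests.get(url, headers={'Authorization': auth,
                                                 'Content-Type': 'application/json'}).content)

def instance_alert_settings(ids):
    # Same endpoint and filter as in the question: only settings with a non-empty alertExpr...
    path = '/device/devices/%s/devicedatasources/%s/instances/%s/alertsettings' % ids
    return lm_get(path, '?v=3&size=1000&filter=alertExpr!:""')

# 'targets' would be built from your existing device -> datasource -> instance iteration;
# the tuples here are made-up placeholders...
targets = [(101, 2021, 30001), (101, 2021, 30002)]

# Keep max_workers modest so you stay inside the portal's API rate limits...
with ThreadPoolExecutor(max_workers=8) as pool:
    overrides = [r for r in pool.map(instance_alert_settings, targets) if r.get('items')]

print(json.dumps(overrides, indent=2))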
Reporting on sqlStatusCode in SQL Server Connection Status MSSQLSERVER
Hello, all! I'm a bit new to reporting in LM and am trying to wrap my head around it. I'm trying to make a report on SQL Server uptime status from SQL Server Connection Status > MSSQLSERVER. My end goal is to display the percentage of uptime over the past 24 hours, using whatever graph is necessary. I was directed to report on sqlStatusCode, but that's not necessarily a rule set in stone. Whenever I look at the Raw Data tab, I see a table that has everything I could possibly want. However, whenever I try to report using any widget other than a Custom Graph widget, it only reports on the most recent event in the Raw Data table. Is there a way to report on the whole table? Or is it possible to export this data? It seems easy in my head: just get a count of every row where sqlStatusCode = 0, then divide that by the total number of rows and multiply by 100. Is it possible to do this?
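If you do end up exporting the raw data (for example as a CSV), the arithmetic described above is a one-liner in pandas. A minimal sketch, assuming a hypothetical export file named raw_data.csv with a sqlStatusCode column and one row per poll:

import pandas as pd

# Hypothetical export of the Raw Data table; the file name is an assumption,
# the column name comes from the datapoint discussed above...
df = pd.read_csv('raw_data.csv')

# Uptime % = rows where sqlStatusCode == 0, divided by total rows, times 100...
uptime_pct = (df['sqlStatusCode'] == 0).mean() * 100
print(round(uptime_pct, 2))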
Using LogicMonitor's REST API as a Power BI Source - Part 2

Overview
Back in 2020 I shared an article on how to use any of LogicMonitor's REST API methods as a datasource in Power BI Desktop for reporting. That opened up a good deal of potential but also had some limitations; in particular, it relied on basic authentication, which will eventually be deprecated, and it could only make a single API call, so it could only fetch up to 1,000 records. I've documented how to use a LogicMonitor bearer token instead of basic authentication, but bearer tokens aren't currently available in every portal (just our APM customers for now), and that approach still faces the single-call limitation. In lieu of a formal Power BI data connector for LogicMonitor being available yet, there is another option that is more secure and a good deal more flexible: using Microsoft Power BI Desktop's native support for Python!

Folks familiar with LogicMonitor's APIs know there is a wealth of example Python scripts for many of our REST methods (example). These scripts not only let us leverage accepted methods of authentication, but also allow combining calls and tweaking results in various ways that can be very useful for reporting. Best of all, using these scripts inside Power BI isn't difficult, and they can even be used for templated reports (I'll include some working examples at the end of this article). While these instructions focus on Power BI Desktop, reports leveraging Python can also be published to the Power BI service (Microsoft article for reference).

Prerequisites
- Power BI Desktop. You can install this via Microsoft's Store app or from the following link: https://www.microsoft.com/en-us/download/details.aspx?id=58494
- The latest version of Python installed on the same system as Power BI Desktop. This can also be installed via Microsoft's Store app or from: https://www.python.org/downloads/windows/
- Some basic familiarity with LogicMonitor's REST APIs, though I'll provide working examples here to get you started. The full API reference can be found at: https://www.logicmonitor.com/support/rest-api-developers-guide/overview/using-logicmonitors-rest-api

First, install Power BI Desktop and Python, then configure each according to this Microsoft article: https://learn.microsoft.com/en-us/power-bi/connect-data/desktop-python-scripts. When adding the Python modules, you'll also want to add the 'requests' module by running:

pip install requests

Modifying Python Scripts for Use in Power BI
As mentioned in the Microsoft article above, Power BI expects Python scripts to use the Pandas module for output. If you're not familiar with Pandas, it's used extensively by the data science community for analyzing bulk data. Adding Pandas to one of the example scripts can be as easy as adding 'import pandas as pd' to the list of import statements at the top of the script, then converting the JSON returned by LogicMonitor's API to a Pandas dataframe. For example, if the script captured the API results (as JSON) in a variable called "allDevices", we can convert that to a Pandas dataframe with something as simple as:

pandasDevices = pd.json_normalize(allDevices)

In that example, "pd" is the name we gave to the Pandas module back in the "import" statement we added, and "json_normalize(allDevices)" tells Pandas to take our JSON and convert it to a Pandas dataframe. We can then simply print that variable as our output for Power BI Desktop to use as reporting data.
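To make that pattern concrete before we get to a real script, here is a tiny, standalone sketch; the two device records are made up purely for illustration and aren't anything returned by the API:

import pandas as pd

# A couple of made-up records standing in for the JSON a LogicMonitor API call would return...
allDevices = [
    {'id': 1, 'displayName': 'web-01', 'hostStatus': 'normal'},
    {'id': 2, 'displayName': 'db-01', 'hostStatus': 'dead'},
]

# Flatten the JSON into a Pandas dataframe...
pandasDevices = pd.json_normalize(allDevices)

# Printing the dataframe is what hands the rows to Power BI Desktop as reporting data...
print(pandasDevices)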
Below is a full Python script that fetches all the devices from your portal and prints them as a Pandas dataframe. This is just a minor variation of an example given in our support documentation. You'd just need to enter your own LogicMonitor API ID, key, and portal name in the variables near the top.

#!/bin/env python
import requests
import json
import hashlib
import base64
import time
import hmac
# Pandas is required for Power BI integration...
import pandas as pd

# Account Info...
# Your LogicMonitor portal's API access ID...
AccessId = 'REPLACE_WITH_YOUR_LM_ACCESS_ID'
# Your LogicMonitor portal's API access Key...
AccessKey = 'REPLACE_WITH_YOUR_LM_ACCESS_KEY'
# Your LogicMonitor portal. Example: if you access your portal at https://xyz.logicmonitor.com, then your portal name is "xyz"...
Company = 'REPLACE_WITH_YOUR_LM_PORTAL_NAME'

# Create list to keep devices...
allDevices = []

# Loop through getting all devices...
count = 0
done = 0
while done == 0:
    # Request Info...
    httpVerb = 'GET'
    resourcePath = '/device/devices'
    data = ''
    # The following query filters for just standard on-prem resources (deviceType=0), so adjust to suit your needs...
    queryParams = '?v=3&offset=' + str(count) + '&size=1000&fields=alertStatus,displayName,description,deviceType,id,link,hostStatus,name&filter=deviceType:0'

    # Construct URL...
    url = 'https://' + Company + '.logicmonitor.com/santaba/rest' + resourcePath + queryParams

    # Get current time in milliseconds...
    epoch = str(int(time.time() * 1000))

    # Concatenate Request details...
    requestVars = httpVerb + epoch + data + resourcePath

    # Construct signature...
    hmac1 = hmac.new(AccessKey.encode(), msg=requestVars.encode(), digestmod=hashlib.sha256).hexdigest()
    signature = base64.b64encode(hmac1.encode())

    # Construct headers...
    auth = 'LMv1 ' + AccessId + ':' + signature.decode() + ':' + epoch
    headers = {'Content-Type': 'application/json', 'Authorization': auth}

    # Make request...
    response = requests.get(url, data=data, headers=headers)

    # Parse response & total devices returned...
    parsed = json.loads(response.content)
    total = parsed['total']
    devices = parsed['items']
    # Use extend (not append) so every page lands in one flat list of devices...
    allDevices.extend(devices)
    numDevices = len(devices)
    count += numDevices
    if count == total:
        print("done")
        done = 1
    else:
        print("iterating again")

# (for debugging) Print all devices...
# print(json.dumps(allDevices, indent=5, sort_keys=True))

# Grab the collected data items...
items = allDevices

# Convert the JSON to a Pandas dataframe that Power BI can consume...
resources = pd.json_normalize(items)

# Print the dataframe...
print(resources)

If you run that Python script directly, you'll see it prints in a columnar format instead of the raw JSON returned by the API. It's a good example of how Pandas converts the raw data into a more formal data structure for manipulation and analysis. Power BI Desktop leverages that data directly for ingestion into its own reporting engine, which is a pretty powerful combination. Now let's show how to put it to use!

How to create a Power BI report that pulls data directly from LogicMonitor via Python
1. In Power BI Desktop, click the Get Data button. This can be to start a new report or to add to an existing report.
2. Choose to get data from an "Other" source, choose "Python script", then click the Connect button.
3. Paste in your complete and working Python script, then click OK. (Some examples are attached to the bottom of this article.)
4. Power BI will run the script. Depending on the amount of data being retrieved, this can take anywhere from a few seconds to a few minutes. When it's complete, you'll see the Navigator pane with the name of the Python Pandas dataframe from the script output.
5. Check the box next to the item to see the data preview. If the sample looks good, then click the Load button.
6. After Power BI has loaded the data, you'll be presented with the report designer, ready for you to create your report. To see the full preview of the data from your portal, click the Data icon to the left of the report workspace.
7. When you need to pull the latest data from LogicMonitor, just click the Refresh button in the toolbar.

To Convert a Report to a Parameterized Template
If you've created a Python-based report and want to save it as a re-usable, parameterized template, we first need to add the necessary parameters and enable Power BI to pass those values to the script.
1. With the report we want to turn into a template active, click the Model icon to the left of the workspace. From there, click the three dots in the upper-right corner of the table generated by the Python script and choose Edit Query. That will open the Power Query Editor.
2. From there, click Manage Parameters on the toolbar. For our example we'll add three new parameters, which we'll call "AlertID", "AlertKey" & "PortalName" (feel free to label them however you choose).
3. For each, enter the respective criterion used for accessing your LogicMonitor API in the Current Value field. Below is an example of what it would look like when completed. When done, click the OK button to close the dialog.
4. Next, click the table name in the Queries list ("alerts" in our example screenshot) and click the Advanced Editor option in the toolbar. You'll see Power BI's M language query for our datasource, including the Python script embedded in it.
5. We're going to edit the script to replace the hard-coded API parameters with the Power BI parameters we defined instead. Replace the values of the script's AccessId, AccessKey, and Company variables with the following, respectively (including the quotes):
" & AccessID & "
" & AccessKey & "
" & PortalName & "
Note that those will sit inside the single quotes for each of the variables (see the sketch at the end of this article for how the finished assignments look). Refer to the screenshot below for an example of how it would look (the changes have been highlighted). When ready, click the Done button.
6. Click Close & Apply on the Power Query Editor to commit our changes. If all looks good, let's now save this as a Power BI template.
7. Click the File menu, then choose Save As. Change the Save As Type to "Power BI template files (*.pbit)", provide a filename, and click Save. Power BI will prompt you to provide a description for your template.

Your report is now ready for sharing! When you open the template file, you'll be prompted to enter values for the parameters we configured in steps 2 & 3. Enter those, hit the Load button, and you'll then be presented with the report designer, ready to go.

Example Files
Here are some example Python scripts modified to use Pandas to help get you started:
- get_alerts.powerbi.py
- get_devices_powerbi.py
Here are some basic, ready-to-use Power BI report templates based on the above scripts:
- Power BI template for reporting on Alerts
- Power BI template for reporting on Resources/Devices
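As a footnote to step 5 above: once the Power BI parameters are spliced into the M query, the top of the embedded Python script ends up looking roughly like the snippet below. The parameter names follow the example in this article; this is an illustrative sketch of the quoting, not a copy of the attached templates.

# Inside the M query's quoted Python text, each hard-coded value is replaced so the
# Power BI parameter gets concatenated into the single-quoted Python string...
AccessId = '" & AccessID & "'
AccessKey = '" & AccessKey & "'
Company = '" & PortalName & "'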
QBR Report - Obtaining Data
Hello community, is there a way with the API to GET a datapoint like CPU and have it return the average over, say, 3 months? Would I need to pull all the datapoints from the last 3 months into Excel or a database and then average the data, or is there a way to get it straight from the source? The only way I can think of is running a Metric Trends report as CSV and extracting the data, but then you get an error if it spans more than 1 month: "You must select the maximum date range as less than or equal to last month. If you require a date range more than the previous month, you must schedule the report."
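There isn't an aggregate-over-3-months call I can point to with certainty, but one pattern that avoids the report date-range limit is to pull the raw instance data through the REST API in smaller windows and average it yourself. The sketch below does that; treat it as a starting point rather than a confirmed recipe, since the data resource path, the response shape (parallel dataPoints/values arrays), the weekly chunking, and the IDs/datapoint name are all assumptions to verify against your portal and the API docs.

import requests, json, hashlib, base64, time, hmac

# Placeholders: your API credentials plus the device/datasource/instance IDs and
# datapoint name you want to average (all hypothetical values here)...
AccessId = 'REPLACE_ME'
AccessKey = 'REPLACE_ME'
Company = 'REPLACE_ME'
DEVICE_ID, HDS_ID, INSTANCE_ID = 123, 456, 789
DATAPOINT = 'CPUBusyPercent'

def lm_get(resource_path, query):
    # LMv1 signing, same pattern as LogicMonitor's example Python scripts...
    epoch = str(int(time.time() * 1000))
    request_vars = 'GET' + epoch + resource_path
    digest = hmac.new(AccessKey.encode(), msg=request_vars.encode(),
                      digestmod=hashlib.sha256).hexdigest()
    auth = 'LMv1 %s:%s:%s' % (AccessId, base64.b64encode(digest.encode()).decode(), epoch)
    url = 'https://%s.logicmonitor.com/santaba/rest%s%s' % (Company, resource_path, query)
    return json.loads(requests.get(url, headers={'Authorization': auth,
                                                 'Content-Type': 'application/json'}).content)

# Assumed raw-data endpoint for a single instance; verify against the REST API docs...
path = '/device/devices/%d/devicedatasources/%d/instances/%d/data' % (DEVICE_ID, HDS_ID, INSTANCE_ID)

end = int(time.time())
start = end - 90 * 24 * 3600      # roughly three months back
chunk = 7 * 24 * 3600             # pull a week at a time to stay under any range limits
values = []
t = start
while t < end:
    resp = lm_get(path, '?v=3&start=%d&end=%d' % (t, min(t + chunk, end)))
    # Assumes the response carries parallel 'dataPoints' and 'values' arrays...
    dps = resp.get('dataPoints') or []
    if DATAPOINT in dps:
        col = dps.index(DATAPOINT)
        values += [row[col] for row in resp.get('values', [])
                   if isinstance(row[col], (int, float))]
    t += chunk

if values:
    print('~3-month average of %s: %.2f' % (DATAPOINT, sum(values) / len(values)))
else:
    print('No data returned for the requested window.')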
Reports UIv4 now available!
Reports UIv4 can now be toggled on at the top of the Reports screen. Check out some of the new features, including:
- a revamped reports tree navigation
- a new reports listing overview with search
- easy access to report actions
This is a per-user setting. You can also easily toggle between UIv3 and UIv4 to see what's changed. The Reports support doc has a more in-depth view of what's now available. We would love to hear your thoughts on the new UI and learn more about how we can improve!
Exportable/printable reporting
LogicMonitor is great at displaying data on a screen, but not so much at exporting or printing it. I'm super grateful for all the hard work my POV team put in to help me find some solutions. They really thought outside the box, and we found some options to get the data out of the system that will work for us in the short term. I want to know about some of the specific goals for reporting and exporting that information from LM. I support an ecosystem that is still very reliant on having paper copies of reporting data, so the ability to email PDF/CSV reports directly from the platform is very important to us. Even something like the ability to put a report file into Google Drive or OneDrive on a schedule would be nice. Having this be an automated process would give us feature parity with the monitoring platform we are leaving.
Is anyone else getting issues creating Bandwidth Reports on Switches?
Currently, attempting to generate a Bandwidth Report with more than 10 interfaces causes an error, where previously it would just create the report and provide the "Top 10" interfaces. I am curious if anyone else is having this trouble. Netflow reports generate the same error; the logs indicate an inability to create more than 20 graphs. This error is odd, as just a few days ago we had a report run with over 30 graphs without issue. Curious if anyone else has run into this.
LMConfig and Reporting on ConfigSources
Hi guys, we just implemented LMConfig and I'm trying to run a report to see which devices do not have their IOS configuration successfully downloaded. We have some legacy equipment still not configured for SSH and TACACS accounts, so I'm trying to avoid manually hunting through all our devices to determine which ones need SSH configured and which ones need the TACACS account added. Even just being able to pull a report on all devices showing which ones have the ConfigSources - IOS Configs field would be a big help. See the attached pic for a visual. Cheers and thanks in advance, Joel
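Not a full answer, but if you're comfortable scripting, one workaround while you wait on a native report is to walk the REST API: list your devices, then check whether each one has the IOS ConfigSource applied. A minimal sketch is below. It relies on the hypothetical lm_get() LMv1-signed helper from the earlier sketches on this page, it only checks whether the ConfigSource is applied at all (not whether the last config collection succeeded), and the filter syntax plus the "IOS" name match are assumptions to adjust to your ConfigSource's actual name.

# Sketch: flag devices that do NOT have an applied datasource whose name looks like
# the IOS ConfigSource. lm_get(path, query) is the same hypothetical LMv1-signed GET
# helper used in the sketches earlier on this page...

def devices_missing_ios_config():
    missing = []
    offset, total = 0, None
    while total is None or offset < total:
        page = lm_get('/device/devices',
                      '?v=3&size=1000&offset=%d&fields=id,displayName' % offset)
        total = page.get('total', 0)
        items = page.get('items', [])
        for device in items:
            # Assumes applied datasources can be filtered by name; adjust the match
            # ("IOS") to whatever your ConfigSource is actually called...
            applied = lm_get('/device/devices/%d/devicedatasources' % device['id'],
                             '?v=3&size=1000&filter=dataSourceName~"IOS"')
            if applied.get('total', 0) == 0:
                missing.append(device['displayName'])
        offset += len(items)
        if not items:
            break
    return missing

print('\n'.join(devices_missing_ios_config()))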
Report Option - Exclusion
When building a report, I would like to be able to exclude things. There are several scenarios for excluding:
- Exclude groups - We have a lot of device groups. It would be simpler to exclude 1 or 2 groups instead of adding 2 dozen or more groups.
- Exclude devices that are in an SDT.
- Exclude devices that have alerting/monitoring disabled - People are running reports and then asking me why some devices show "No Data". That's usually because alerting has been disabled, but the device is still in the report because it matched the criteria. This causes a lot of run-around to cross-reference the reason and give feedback, and then it happens again the next day because people forget why.