Solved

Device DataSource Instance datapoint historical data using RestAPI v3

  • 30 October 2023
  • 25 replies
  • 138 views

Userlevel 6
Badge +11

I am having problems getting the REST API to return any data, regardless of the combination of paths, query params, and time filters I try.  What am I doing wrong?  Here’s the URL I’ve ultimately built for this effort:

$ddsi is a successfully retrieved object (DeviceDataSourceInstance)

Start and End are from these:

    [int]$start = Get-Date (Get-Date).AddMonths(-3) -UFormat %s
    [int]$end   = Get-Date (Get-Date).AddMonths(-3).AddMinutes(5) -UFormat %s

/device/devices/$($ddsi.deviceid)/devicedatasources/$($ddsi.devicedatasourceid)/data?size=500&offset=0&start=$start&end=$end&datapoints=Capacity,PercentUsed

All of the pieces and parts seem to line up with examples I’ve found here and in the LM Docs… it doesn’t error out, but returns nothing.

Goal is to get volume capacity metrics from 3 months ago.

Where am I going awry here?  Everything works up until I add the /data at the end.
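For reference, here’s the whole request assembled end to end. The portal name and the $ddsi object here are stand-ins (the IDs are just examples), and Bearer auth is assumed; the actual call is left commented out:

```powershell
# Epoch-seconds window: a five-minute slice starting three months back
[int]$start = Get-Date (Get-Date).AddMonths(-3) -UFormat %s
[int]$end   = Get-Date (Get-Date).AddMonths(-3).AddMinutes(5) -UFormat %s

# Stand-in for the retrieved DeviceDataSourceInstance object (example IDs)
$ddsi = [pscustomobject]@{ deviceId = 23387; deviceDataSourceId = 1010606 }

$portal = 'example'   # placeholder portal name
$path   = "/device/devices/$($ddsi.deviceId)/devicedatasources/$($ddsi.deviceDataSourceId)/data"
$query  = "size=500&offset=0&start=$start&end=$end&datapoints=Capacity,PercentUsed"
$url    = "https://$portal.logicmonitor.com/santaba/rest${path}?${query}"

# $headers = @{ Authorization = "Bearer $token" }   # token is a placeholder
# Invoke-RestMethod -Uri $url -Headers $headers
$url
```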


Best answer by Cole McDonald 10 November 2023, 23:06



Userlevel 7
Badge +19

Make sure it works in postman first. That will tell you if the problem is your request or your script.

Userlevel 6
Badge +11

Make sure it works in postman first. That will tell you if the problem is your request or your script.

I’ve never been able to get postman to work right with LM’s API… even while taking the LMCD :)

I’ve just used my own code for it as a result.

Userlevel 6
Badge +11

Once I get these grabs figured out, I’m going to start building longer-term predictive metrics: things like 3-month, randomly sampled standard deviations to determine steady or spiky used-volume growth.  That will unlock growth metrics like “Days Until Full” and “Desired Capacity” to determine how best to address a volume that is filling up.  Maybe it’s a SQL backup that didn’t complete, so fix it and delete the previous one… maybe it’s a single user storing a copy of the GoT Red Wedding episode video file (that sounds awfully specific, Cole)… or maybe it’s nice steady growth, where the volume can simply be expanded to cover a quarter’s growth.  That lets an account manager spend some quality time with the clients who don’t want as much interaction, so we can find out whether their technology needs are changing and suggest other optimizations/solutions where needed.

Userlevel 5
Badge +11

Make sure it works in postman first. That will tell you if the problem is your request or your script.

I’ve never been able to get postman to work right with LM’s API… even while taking the LMCD :)

I’ve just used my own code for it as a result.

With the move to Bearer Token auth, you don’t need a pre-request script any longer, which makes using tools like Postman or Insomnia that much easier.

Userlevel 6
Badge +11

I can give it a shot and revisit postman… looks like there’s no longer a purely local option (have to make the free account now).  My tinfoil hat doesn’t like it as much.  Nothing seems to indicate where they’re storing data with the new version… I’ll have to read through their documentation now to verify.

Userlevel 6
Badge +11

I can give it a shot and revisit postman… looks like there’s no longer a purely local option (have to make the free account now).  My tinfoil hat doesn’t like it as much.  Nothing seems to indicate where they’re storing data with the new version… I’ll have to read through their documentation now to verify.

For anyone else who has the same paranoid questions I do: Trust Center | Postman

Userlevel 7
Badge +19

Meh, tradeoffs everywhere. Having Postman sync to the cloud means I never have to start over with Postman. All my work is always there, even if I change PCs.

Import collection from URL: https://www.logicmonitor.com/swagger-ui-master/api-v3/dist/swagger.json makes it so that all the documented API calls are available. Set some authentication defaults with bearer tokens on the collection and some collection variables and you should be good to go.

Being able to separate your script from the API call makes a big difference when troubleshooting. Also allows you to very quickly tune your API call to get only what you need.

I assume you’ve seen this:

 

Userlevel 6
Badge +11

Meh, tradeoffs everywhere. Having Postman sync to the cloud means I never have to start over with Postman. All my work is always there, even if I change PCs.

Import collection from URL: https://www.logicmonitor.com/swagger-ui-master/api-v3/dist/swagger.json makes it so that all the documented API calls are available. Set some authentication defaults with bearer tokens on the collection and some collection variables and you should be good to go.

Being able to separate your script from the API call makes a big difference when troubleshooting. Also allows you to very quickly tune your API call to get only what you need.

I assume you’ve seen this:

 

The tradeoff is that they don’t state whether your API tokens are stored on-device or in the cloud… so when they inevitably get hacked, the bad guys have access to your credentials. I’ll take a peek through that thread and see what it says for me; I know I did a bunch of trying and searching. But if I can’t find a statement of where the data is stored specifically (not just assurances of safety), then I can’t use Postman, so it’ll end up a moot point.

Userlevel 7
Badge +19

They are definitely stored in your account in the cloud. I switched PCs, and when I logged into Postman, my creds were saved there.

Userlevel 6
Badge +11

I’ve requested explicit statements of this from them, as I didn’t find anything in their security documentation about storage locations for different types of data.  I’m paranoid about such things… but I’ve been proven right over and over again throughout my career.  So… I’ll keep poking at it using the PowerShell code I’ve got.

Userlevel 6
Badge +11

From Postman’s support TSE:
 

 

Userlevel 7
Badge +19

There are other options: other REST clients that let you make REST calls entirely locally, contained within the app or as an extension in your browser.

Userlevel 6
Badge +11

That’s OK… I don’t need a GUI.  I prefer CLI (1970s geek).

Userlevel 5
Badge +11

@Cole McDonald I personally use Insomnia.rest. It has a local electron app that I use.

 

While using curl or Invoke-RestMethod is quick and easy, sometimes it is just easier to click Go and see the structured data right there.

Userlevel 6
Badge +11

Beyond the toolset for exploration, has anyone had any success accessing historical data via the REST API from PowerShell at all?  Here’s the kind of thing I’m asking for:
 

https://<ourthing>.logicmonitor.com/santaba/rest/device/devices/23387/devicedatasources/1010606/instances/19979217/data

returns:

@{dataSourceName=WinVolumeUsageAzure-; dataPoints=System.Object[]; values=System.Object[]; time=System.Object[]; nextPageParams=}

It does have data:

 

Userlevel 6
Badge +11

Specifically based on this:

 

Userlevel 6
Badge +11

returns:

@{dataSourceName=WinVolumeUsageAzure-; dataPoints=System.Object[]; values=System.Object[]; time=System.Object[]; nextPageParams=}

I don’t have API access to LM to test this, but it looks like your dataPoints and values properties do contain arrays of items (“System.Object[]”). Are the properties blank? Try piping the return value to ConvertTo-Json -Depth 100. I find that an easy way to look at nested objects (it’s like Groovy’s .dump()).
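To illustrate, here’s a mocked-up object shaped like the /data payload above (the values are invented); the default formatter collapses the nested arrays to “System.Object[]”, while ConvertTo-Json -Depth 100 shows the whole tree:

```powershell
# Mock of the /data payload shape from the post above (values invented)
$resp = [pscustomobject]@{
    dataSourceName = 'WinVolumeUsageAzure-'
    dataPoints     = @('Capacity', 'PercentUsed')
    time           = @(1699000000000, 1699000060000)
    values         = @(500, 42.1), @(500, 42.3)
    nextPageParams = ''
}

"$resp"                                  # nested arrays collapse to System.Object[]
$resp | ConvertTo-Json -Depth 100        # full tree, nothing hidden
```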

 

Userlevel 7
Badge +19

@Cole McDonald I personally use Insomnia.rest. It has a local electron app that I use.

 

While using curl or Invoke-RestMethod is quick and easy, sometimes it is just easier to click Go and see the structured data right there.

For what it’s worth, you should be able to use the swagger docs themselves to test out API calls. This was never explored/explained/enabled by LM, because the auth was too complicated for even swagger to decipher. Now that bearer tokens are a thing, though, you should be able to put your bearer token into the swagger web page, provide the path and query variables for a call, and run it. That’s what the “Authorize” button at the top of the page is for (the one they forgot to disable, or couldn’t figure out how to disable). LM just needs to turn on a couple of flags when generating the swagger docs (like the ability to specify the portal URL, and the “execute” buttons swagger normally offers), and then it should be possible to run all the API calls you want right from the swagger docs.

Userlevel 6
Badge +11

Auth works, but the fields are all uneditable.

Userlevel 7
Badge +19

Right, LM needs to flip some flags when they generate the swagger docs. I just put in a feedback item for it, but the more the merrier.

Really, now that we have bearer tokens, there’s no reason we shouldn’t be able to do what swagger does in their own example. Notice the “try it out” buttons on each call. 

Userlevel 6
Badge +11

Got it to cooperate using PowerShell… The filters I was adding pass through two functions I wrote, and somewhere along the way the ‘?’ dividing the URL from the query string was being inserted incorrectly.  Fixed that.  Also, the datapoint names are CaSe SeNsItIvE, and when accessing a single instance there are no .item objects, so I had to check for the quantity of returns.
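In case it saves anyone else the debugging time, the separator logic boils down to something like this (a hypothetical helper, not my exact code):

```powershell
# Hypothetical helper: append a query parameter, using '?' before the first
# parameter and '&' before every one after it
function Add-QueryParam {
    param([string]$Url, [string]$Name, [string]$Value)
    $sep = if ($Url.Contains('?')) { '&' } else { '?' }
    "$Url$sep$Name=$Value"
}

$url = '/device/devices/23387/devicedatasources/1010606/instances/19979217/data'
$url = Add-QueryParam $url 'start' '1690000000'
$url = Add-QueryParam $url 'end'   '1690000300'
# Datapoint names are case-sensitive: 'PercentUsed', not 'percentused'
$url = Add-QueryParam $url 'datapoints' 'Capacity,PercentUsed'
$url
```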

I’m ready to move forward with some deeper predictive metrics now :)

Userlevel 6
Badge +11

Is there a way to get the DeviceDatasourceID as a built-in variable from the platform?  That would save a ton of API overhead.  Then I could just grab the instances from that and get my historical data using the DDI’s properties.  I don’t find anything in the LM Docs that states this is available for use in dataSources; I do find it for alerts, though.

I can get the deviceId and the instanceId, but not the dataSourceId… from the REST API, I can access the device and the dataSource, but not the instance… so I have to get the deviceDataSource to get to the instance no matter how I slice it.  That adds an extra unnecessary API call that could be a single pre-filtered call.

I’ll put in a feedback item for direct instance access, as I would prefer that.  We already seem to have that ID as an instance-level token, ##system.instanceid##, so I would expect there to be a ##system.devicedatasourceid## we could reach to get the IDs needed to fill out the URL to the instance.
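For anyone following along, the two-hop path I’m describing looks like this (URL construction only; the dataSource name and all IDs are examples):

```powershell
# Hop 1: find the deviceDataSource id by filtering on the dataSource name
$deviceId = 23387
$filter   = [uri]::EscapeDataString('dataSourceName:"WinVolumeUsage-"')
$ddsUrl   = "/device/devices/$deviceId/devicedatasources?filter=$filter"

# Hop 2: with the id returned by hop 1 (hard-coded here as an example),
# list the instances; each instance id then completes the /data URL
$ddsId        = 1010606
$instancesUrl = "/device/devices/$deviceId/devicedatasources/$ddsId/instances"
$dataUrl      = "/device/devices/$deviceId/devicedatasources/$ddsId/instances/19979217/data"

$ddsUrl; $instancesUrl; $dataUrl
```

A ##system.devicedatasourceid## token would let hop 1 be skipped entirely.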

Userlevel 7
Badge +19

Is there a way to get the DeviceDatasourceID as a built in variable from the platform? 

Not that i’m aware.

Userlevel 6
Badge +11

Is there a way to get the DeviceDatasourceID as a built in variable from the platform? 

Not that i’m aware.

Confirmed absent through the support team

Userlevel 6
Badge +11

Success!  Now to optimize before broadening the appliesTo()

I’m successfully reaching the timeseries data from a different dataSource on the same device then using that data to calculate predictive metrics.  I still have to add a couple of calculations, but this is the basic goal I’m shooting for.

Need to optimize the code as well; it’s currently several API calls for each device, and I don’t want it to eat the whole TCP ephemeral port range.
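The pacing approach I’m leaning toward is simple batching, roughly like this (the device ids and batch size here are arbitrary placeholders):

```powershell
# Sketch: walk a device list in small batches, pausing between batches so
# the per-device API calls don't exhaust ephemeral ports all at once
$deviceIds = 1..10          # placeholder ids
$batchSize = 4
$batches = for ($i = 0; $i -lt $deviceIds.Count; $i += $batchSize) {
    $last = [Math]::Min($i + $batchSize, $deviceIds.Count) - 1
    ,@($deviceIds[$i..$last])
}
foreach ($batch in $batches) {
    # one Invoke-RestMethod per device id would go here
    # Start-Sleep -Milliseconds 500   # breathing room between batches
}
$batches.Count
```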

Reply