Forum Discussion

casteeb
8 months ago

Remote DB connections via Oracle SQLnet

While LogicMonitor seems determined that it needs to "connect" to a remote host to gather data, in the case of Oracle DB up/down testing it's much easier to just perform a remote connection via SQLnet.

Has anyone been able to build remote database login calls from the collectors?

We created a Groovy script from a bash script to connect via Oracle SQLnet on the Collector, but we haven't been able to integrate it into the Collector server. The idea being: we'll add these scripts for each database on each server, using our Oracle LDAP DB names (which resolve fine in our SQLnet script), do a simple connect, verify, and return OK / not OK.

This will help us resolve the issues with LogicMonitor failing to understand Oracle RAC and PDB databases.
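
To make the idea concrete, here's a minimal Groovy sketch of that connect-and-verify check, shelling out to sqlplus (the credentials, the 30-second timeout, and the DB_UP marker are placeholders for illustration, not our production script):

```groovy
// Classify sqlplus output: DB_UP only appears if the logon and a
// trivial query both succeed
def classify = { String out -> out.contains("DB_UP") ? "OK" : "NOT OK" }

def checkDb = { String dbName ->
    // -S silent, -L do not re-prompt after a failed logon
    def proc = ["sqlplus", "-L", "-S", "monuser/monpass@${dbName}"].execute()
    proc.outputStream.write("select 'DB_UP' from dual;\nexit;\n".bytes)
    proc.outputStream.close()
    proc.waitForOrKill(30000)
    classify(proc.text)
}
```

The DB name is passed straight to sqlplus, so any name that the client's LDAP/SQLnet config resolves should work unchanged.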

 

Thanks,

Ben

 

  • Stuart, sorry I haven't had time to get back to this. We are actively working with LM support to get this implemented. Once we have the algorithm implemented and working, I'll update again on how it worked out. 

    Appreciate your input. 

  • Anonymous

    What do you mean by

    haven't been able to integrate that into the Collector server

    Have you built the collection script? Does it run from the collector debug? Does it output in LM's prescribed output format? That may be putting the cart before the horse, since you'd need to know whether it's a script or batchscript datasource, and that will depend on discovery. Have you figured out how you're going to get instances into LM (manually or automatically)? I assume it would be multi-instance, since you might have multiple DBs per resource in LM.
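
    For reference, the prescribed output is one datapoint per line: key=value for a plain script, and wildvalue-prefixed for a batchscript. The names below are made-up examples:

    ```
    script (single instance):
    status=0

    batchscript (one line per discovered instance):
    PRODDB1.status=0
    TESTDB1.status=1
    ```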

  • Anonymous

    Ok, so you need it to run on the collector. If you're not running it in the collector debug now, how are you running it? Groovy installed on your laptop? Are there custom libraries you've installed on your laptop that you're importing into the script? If so, you'd need to install those jar files on the collector. Mike_Aracic could probably point you to docs on how to do that. 

    • casteeb

      Yeah, that's the problem: we added the script using the LM Collector GUI and associated it with that server. The script shows up as valid but isn't associated to run on any host. We are new to LM and still learning how it works. If we could execute it, I think we can figure out the rest. 

      I did try to pre-test the script with Groovy on the Collector server, but while LM affirms that it can run Groovy scripts, that executable is not available system-wide or in the user environment. I'm hesitant to add another install of Groovy because I don't want to risk a library clash with LM itself. The SQLnet client and LDAP binaries have all been tested on the server and connect fine to any valid database name. 

      If we can just get the script to run and raise some errors, we'd have something to move forward with. 

      • Anonymous

        If your new datasource isn't applying to any devices, you might consider checking the AppliesTo expression. But have you considered how your instances will get created? Discovery? Batchscript vs. script?

  • Thanks Stuart, we worked on implementing this yesterday. Found a lot of good scripting examples in the built-in Oracle code.

    Added the DB list to the server, and added a simple check_oracle DataSource script to the My Module Toolbox on the collector, which just echoes the list and returns 0. In the check_oracle DS script we set AppliesTo (checked with Launch IDE and validated that it resolves to the specific server name and IP where we put the list). But that DataSource is still stuck in "Not in use" status. I'm missing something...

    • Anonymous

      You should have one DS with two scripts. If you have a script to automatically discover the list of database names (check_oracle), that seems like it would be your discovery script. Then your collection script would actually check to see that each instance discovered in your discovery script can be connected to.

  • Oddly, in this case I want to manually add/remove the DB names in a list on each server, so maintaining one list on each server should work for us. The auto discovery for Oracle RAC and PDBs in LM is sort of the reason we got here. lol

    That's not really a problem for us, because adding/removing a database requires about 25 tasks across different functional areas in the data center.

    • Anonymous

      Then in that case, I'd go with this discovery script:

      hostProps.get("oracle_rac_dbs").tokenize(",").each{
          def (wildvalue, wildalias) = it.tokenize("|")
          println("${wildvalue}##${wildalias}")
      }
      return 0

      And put the list on each server in the oracle_rac_dbs property. Comma-separate each db/display-name pair, and separate the db name from the display name with a pipe character.
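
      With made-up names, the property value would look like:

      ```
      oracle_rac_dbs = PRODDB1|Prod Sales DB,TESTDB1|Test DB
      ```

      and the discovery script would then print PRODDB1##Prod Sales DB and TESTDB1##Test DB, one instance per line.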

      That takes care of your discovery. Does your collection script test one DB at a time or all DBs on a server?
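
      If it's one script per server covering all its DBs, a batchscript collection sketch could look like this (the JDBC thin URL, the monuser/monpass credentials, and the fallback sample list are assumptions for illustration):

      ```groovy
      import java.sql.DriverManager

      // 0 = connection OK, 1 = failed; any exception counts as "down"
      def checkDb = { String jdbcUrl, String user, String pass ->
          try {
              DriverManager.getConnection(jdbcUrl, user, pass).close()
              return 0
          } catch (Exception e) {
              return 1
          }
      }

      // On the collector, hostProps is injected by LM; fall back to a
      // sample list here so the sketch runs standalone
      def dbList = binding.hasVariable("hostProps") ?
              hostProps.get("oracle_rac_dbs") : "PRODDB1|Prod,TESTDB1|Test"

      dbList.tokenize(",").each {
          def (wildvalue, wildalias) = it.tokenize("|")
          // assumes the DB name resolves through the client's SQLnet/LDAP setup
          def status = checkDb("jdbc:oracle:thin:@${wildvalue}", "monuser", "monpass")
          println("${wildvalue}.status=${status}")
      }
      // in the real DS, end with: return 0
      ```

      Each output line is wildvalue.datapoint=value, which is what a batchscript DS needs to map results back to the discovered instances.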

  • Stuart, this solution works, we have it implemented now and I just want to thank you for your help. 

    Ben