
We are searching for a way to get the latest known (scanned/collected/...) logged-on users in SCOM (1801). For every device in SCOM, we want to see at a glance who the most recently logged-on users were. Even if a server is no longer available, we want to know who was logged on just before the crash or connection loss.

What we don't want:

  • Collect all logon/logoff events. We are not interested in the history and don't want to waste space in the database, and you would still have to correlate the events (find users with a logon but no matching logoff). We only want the latest logged-on users, without history.
  • Create a monitor that only checks whether users are logged on. We want to know exactly who is logged on (domain + username).
  • Extend the Computer class with a property 'LoggedOnUsers' and populate it with a discovery. This would be possible, but the property would be discovered very frequently, and every change triggers a configuration reload, which we want to avoid. If we could avoid the configuration reload, this would be the best scenario.
  • Import all the users as objects in SCOM. This would mean importing all domain users, all trusted-domain users, and all local users of every device... Not feasible.
  • Create a task and fetch the logged-on users on the fly. If a server is down, we still want to know the latest logged-on users.
  • Use SCCM (Configuration Manager) for this, because the data should be live (as live as possible).

In my opinion it should look like a property of a computer/device class, but without the history or configuration reloads. Or a monitor without a pre-defined error/warning/... state, but with a custom text state (= the logged-on users).

We also use the SCOM data in other (custom) applications via SQL queries on the OperationsManager(DW) databases. If the data is available in the database, exporting it with a SQL query is no problem, and it would then also be possible to search for the devices where a specific user is logged on.

The method shouldn't be limited to logged-on users; the same solution could also apply to other kinds of data.

Does anybody have an idea on how to do this?

AM_Simon
  • Just a note. There is no way to define a class or a single class property with "no history" attribute. Every single property is in configuration and its history is kept in Data Warehouse. However, yes, there is a way to do what you want. – Max May 15 '18 at 21:53

1 Answer


Ok, so, I see you already have deep knowledge of SCOM, so if you don't mind, I'd like to give you a fishing rod, not the fish.

The general idea is to collect the data into a file, upload the file to SCOM, then parse it and insert it into a database (presumably the DW database). To do that:

  1. Create a ProbeActionModuleType workflow which detects each logon (triggered by an event, by checking all current sessions on a timer (https://www.codeproject.com/Articles/133372/Monitoring-of-Logon-Logout-in-Terminal-and-Client), etc.).
  2. This workflow should write the collected data to a file: XML, JSON, etc.
  3. Then the workflow should output a property bag like this:

[InputStream(0)]
public void OnNewDataItems(DataItemBase[] dataItems, bool logicalSet,
    DataItemAcknowledgementCallback acknowledgedCallback, object acknowledgedState,
    DataItemProcessingCompleteCallback completionCallback, object completionState)
    {
      if (shutdown)
        return;
      try
      {
        // Collect data, create JSON, save it into a temp file and upload it to SCOM.
        var TempFolder = Environment.ExpandEnvironmentVariables("%TEMP%");
        char[] pathSeparator = { '\\' };
        pathReportFileName = TempFolder.TrimEnd(pathSeparator) + "\\" + "Report.json";
        if (File.Exists(pathReportFileName))
          File.Delete(pathReportFileName);
        File.WriteAllText(pathReportFileName, JsonConvert.SerializeObject(Results));
        // Return the file name to upload in a property bag.
        Dictionary<string, string> bagItem = new Dictionary<string, string>();
        bagItem.Add("DataGenerated", "OK");
        bagItem.Add("ResultPath", pathReportFileName);
        Global.logWriteInformation("Returning patching report file at " + pathReportFileName + ".", this);
        ModuleHost.PostOutputDataItem(Global.CreatePropertyBag(bagItem),
          new DataItemAcknowledgementCallback(OnAcknowledgementCallback), pathReportFileName);
      }
      catch (Exception e)
      {
        Global.logWriteException("Failed to ...", e, this);
        ModuleHost.RequestNextDataItem();
        return;
      }
    }

In your acknowledgement callback, delete the file and request the next data item:

protected void OnAcknowledgementCallback(object state)
    {
      try
      {
        Global.logWriteInformation("Patch report inventory file has been sent, deleting.", this);
        string sentFileName = (string)state;
        File.Delete(sentFileName);
      }
      catch (Exception e)
      {
        Global.logWriteException("Failed to delete patch report inventory file.", e, this);
      }
      finally
      {
        ModuleHost.RequestNextDataItem();
      }
    }
  4. Then, at your target (Windows Computer or similar), create a Rule with a Write Action of <WriteAction ID="WA_UploadFile" TypeID="System!System.FileUploadWriteAction" RunAs="System!System.PrivilegedMonitoringAccount">

  5. At Microsoft.SystemCenter.CollectionManagementServer, create another (receiving) Rule with a Data Source of <DataSource ID="DS_GetFile" TypeID="System!System.FileUploadDataSource" RunAs="System!System.PrivilegedMonitoringAccount">

  6. In the receiving rule, I use my own WriteActionModuleType workflow, which parses the uploaded file and saves it into the DW database (which is exactly your requirement).

NB! Both the probe action, which produces the file, and the write action, which parses it, must support acknowledgement callbacks; therefore they MUST be managed code, i.e. written in C#, F#, or VB, but NOT PowerShell.

NB! Both the sending and receiving rules MUST have the ConfirmDelivery="true" attribute.
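To illustrate how the pieces fit together, here is a rough sketch of the sending rule (step 4) in management-pack XML. Only the TypeIDs, the RunAs account, and the ConfirmDelivery attribute come from the description above; the rule ID, target, data source, and the write action's inner configuration are hypothetical placeholders that depend on your own module types and their schemas:

```xml
<!-- Sketch only: IDs, Target and the data source module are placeholders. -->
<Rule ID="Custom.LoggedOnUsers.UploadRule" Enabled="true"
      Target="Windows!Microsoft.Windows.Computer"
      ConfirmDelivery="true" Remotable="true" Priority="Normal" DiscardLevel="100">
  <Category>Custom</Category>
  <DataSources>
    <!-- Your ProbeActionModuleType (wrapped in a data source) emitting the
         property bag with the ResultPath of the generated JSON file. -->
    <DataSource ID="DS_CollectLoggedOnUsers" TypeID="Custom.LoggedOnUsers.DataSource" />
  </DataSources>
  <WriteActions>
    <WriteAction ID="WA_UploadFile" TypeID="System!System.FileUploadWriteAction"
                 RunAs="System!System.PrivilegedMonitoringAccount">
      <!-- Configuration depends on the module's schema; typically it references
           the path of the file to upload, taken from the property bag. -->
    </WriteAction>
  </WriteActions>
</Rule>
```

The receiving rule on Microsoft.SystemCenter.CollectionManagementServer (step 5) follows the same pattern, with System.FileUploadDataSource as the data source, your own WriteActionModuleType as the write action, and likewise ConfirmDelivery="true".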

Best regards, Max V.

Max
  • Max, thank you for the information! It seems a bit more complicated than expected (or hoped). I will try to find time to swing this fishing rod into my testlab and catch the fish. – AM_Simon May 16 '18 at 07:31