How to Integrate Your SAP Logs with a Splunk SIEM?
In this article you will learn how to:
Activate the SAP Security Audit Log
Forward your SAP NetWeaver Audit Log to a Splunk Indexer (no need for any third-party adapters, add-ons or tools)
Parse the SAP Audit Log in Splunk
Create your first SAP SIEM use case - "Detect Account Brute-Force Attacks"
Enable alerting for the "Detect Account Brute-Force Attacks" SIEM use case
Introduction
Ingesting your SAP NetWeaver and S/4 logs into your corporate SIEM solution can be challenging, especially if you are trying to ingest the logs directly from the SAP system into your SIEM. There are various third-party log adapters and tools which "translate" SAP log events and generate alerts in your SIEM solution, but these are often quite costly and represent an additional layer that needs to be maintained. Ideally, we would therefore integrate our SAP systems directly into our SIEM infrastructure without additional third-party license costs and maintenance effort.
In an SAP system you would like to monitor different types of logs like ICM/ICF logs, Gateway Logs, Security Audit Logs, and so on.
Some SAP logs have a more common syntax - e.g. the ICM/ICF logs (transaction SMICM) use the common Apache log format, allow flexible customization and can therefore be ingested into a SIEM solution fairly easily.
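For illustration, an ICM HTTP access log entry in the Apache common log format could look roughly like this (host, path and byte count are made-up values):
10.10.10.20 - - [16/Jun/2020:08:18:42 +0200] "GET /sap/bc/gui/sap/its/webgui HTTP/1.1" 200 3103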
A highly important SAP NetWeaver log for a SOC/SIEM is the SAP Security Audit Log (a.k.a. SAL). The majority of your SAP SIEM use cases will be based on the SAP Security Audit Log, as it provides important SAP-specific insights into security-relevant events. These logs look quite different: the SAL file consists of fixed-length records of 200 characters each, written without line breaks, which makes the raw file quite difficult to read or parse. A single log entry looks like this:
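AUW20200616081842001684400004B4 DDIC RSDBA_DBH_SETUP_UPDATE_CHECK 0001RSDBA_DBH_SETUP_UPDATE_CHECK&
(This is the same entry we will dissect in the parsing section below.)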
The purpose of this post is primarily to get these odd-looking logs into our Splunk instance, parse them correctly and make sense of the events we see there. Finally, we will create a simple use case in Splunk that raises an alert whenever Splunk sees a possible SAP account brute-force attack.
Configure SAP NetWeaver to Generate Audit Log Events
By default, the audit log on your SAP system is deactivated, so it needs to be enabled before we can log relevant events.
Log on with an administrative user to your SAP system and execute transaction RZ10. Select the correct profile of your system, choose "Extended maintenance" and click on "Change".
Go on and validate that the relevant SAP profile parameters are set correctly - if not, adapt or add the parameters as follows (rsau/enable switches the audit log on, rsau/integrity activates integrity protection for the audit files, rsau/selection_slots defines the number of available filter slots, and rsau/user_selection enables user-based filter selection):
rsau/enable = 1
rsau/integrity = 1
rsau/selection_slots = 10
rsau/user_selection = 1
While we are at it, go on and create the new profile parameter "rsau/ip_only" - with this set, the system logs client IP addresses rather than terminal names, so we can map audit events to IP addresses later on.
rsau/ip_only = 1
Also, note the following two parameters, which define the location on the file system where the audit logs will be stored as well as the file-name convention used for the audit log files. Adapt these if you need to - for the sake of this tutorial we have set DIR_AUDIT to "/SAL". Note that the SAP system service account "<SID>adm" needs read/write file-system permissions on the directory.
DIR_AUDIT - directory where the security audit logs are stored
FN_AUDIT - file name convention of the security audit log files
Restart your SAP system for the profile parameters to be applied.
After the restart we have switched on the security audit log, but we are still not logging any events. We will need to choose the types of events that we want to log.
Log on to your SAP system and execute transaction RSAU_CONFIG.
Here we can perform two types of configuration: "Dynamic Configuration" and "Static Configuration". In these two categories we define "Filters", and in each filter we specify the type of events, the SAP client, the user, etc. for which we want to log events.
Dynamic Configuration
The general properties of the dynamic configuration are:
Configurations are active directly after saving
No need for SAP system restart - changes are applied ad-hoc
Dynamic Configuration overrides Static Configuration
The filters (the event logging) are active until we deactivate them or they get automatically deactivated after a system restart.
Suitable for tracing events where we adapt our rules through testing or trial and error.
We can use it to fine-tune logging and transfer the configuration to a "Static Configuration".
Static Configuration
The general properties of the static configuration are:
Configurations are persistent
Static configurations are applied once the SAP system is restarted
For the purpose of this tutorial we will create a filter "Filter 01" under the "Static Configuration" tree and activate logging for all audit events, all users and all clients.
Right-click on "Profile 1" under "Static Configuration" and select "Create Filter". Then select "Filter 01" and make the following configurations:
Select checkbox "Filter for recording active"
Set "Client" to " * "
Set "User" to " * "
Select "Classic event selection" and in the drop-down "Select by Priority" choose "All"
Select all checkboxes under "Selection Audit Classes"
Click on the "Save" icon (Ctrl+s) and you are done.
In order for the configurations to be applied, we will need to restart the SAP system.
With the current configuration we will be logging all audit events for all SAP clients and all users. It is highly debatable whether this is a reasonable configuration on a productive SAP system because of the large volume of logs generated, but we would argue it is feasible there as well: a productive system with 50,000 users can be expected to generate roughly 50 to 100 GB of logs per month, i.e. roughly 1 TB of log data for a whole year. Also consider that archiving can bring the required disk space down by a factor of 100. Optimizations are certainly possible and desirable (e.g. reducing the number of logs generated by the internal SAPSYS user account or the number of RFC communication logs).
After the SAP system has restarted, validate that your audit log is active and events are being logged. Go to the audit log directory (defined in the profile parameter DIR_AUDIT) and check the contents of the most recent audit log file - you should already see audit log entries.
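A quick way to check this on the OS level is the following sketch, assuming DIR_AUDIT = /SAL as configured above (the actual file names depend on FN_AUDIT):
ls -lt /SAL
# print the last bytes of the newest audit log file
tail -c 400 "$(ls -t /SAL/* | head -n 1)"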
Create a New Index and Configure a Data Receiver on the Splunk Indexer
Go to the Splunk Indexer and create a fresh, clean index to collect the SAP audit logs - a dedicated index gives you a good separation between SAP and non-SAP logs as well as control over index size and index management. Go to the "Settings" menu and navigate to "Indexes". Click the green button "New Index", name the new index "sap_auditlog" and click on "Save".
On a fresh new Splunk Indexer installation you will need to configure a receiving data port. Go to the "Settings" menu on the Splunk Indexer and click on "Forwarding and receiving". Then click on "Configure receiving" and click on the green button "New Receiving Port". Enter a receiving port and click on "Save" - we will use the standard port 9997 for this example.
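For reference, this UI step corresponds to a splunktcp input stanza in "inputs.conf" on the indexer (e.g. under /opt/splunk/etc/system/local/) - a minimal sketch:
[splunktcp://9997]
disabled = 0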
Forward SAP Audit Logs to the Splunk Indexer
It is assumed that you already have a Splunk Indexer installed on a different system which is located in a network segment where connectivity from the SAP system to the Splunk Indexer is possible on the respective designated port (default port 9997).
On the SAP System
In a large enterprise environment with dozens/hundreds of SAP instances, the most reasonable way of distributing configurations across the SAP landscape would be over the Splunk Deployment Server (Go to Splunk Web UI >> System >> Distributed Environment >> Forwarder Management).
For the purposes of our test setup we will locally and manually install a Splunk Forwarder on our SAP instance.
Go on and download the Splunk Forwarder installation binary for your Linux distribution. Here is an example command, which needs to be adapted to the right distribution and the most current version of the Forwarder:
wget -O splunkforwarder-Linux-x86_64.tgz 'https://www.splunk.com/bin/splunk/DownloadActivityServlet?architecture=x86_64&platform=linux&version=xxxxx&product=universalforwarder&filename=xxxxx.tgz&wget=true'
Extract the Splunk Forwarder archive:
tar xvzf splunkforwarder-Linux-x86_64.tgz -C /opt
Start the Splunk Forwarder:
/opt/splunkforwarder/bin/splunk start
Also enable the Splunk Forwarder service to start at system boot, so that you do not have to start the service manually after every reboot:
/opt/splunkforwarder/bin/splunk enable boot-start
Now we have the forwarder installed, but we are still not forwarding the SAL to the Indexer.
Go back to the SAP system and configure the Splunk Forwarder to forward the SAL to the new index on our Splunk Indexer.
Execute the following command on the SAP server to add the Splunk forward-server (your Splunk Indexer), substituting the IP address with the correct address or DNS name of your Splunk Indexer:
/opt/splunkforwarder/bin/splunk add forward-server 10.10.10.10:9997
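Under the hood, this command writes a tcpout group to the forwarder's "outputs.conf" (under /opt/splunkforwarder/etc/system/local/) - roughly like this sketch; the exact group name may differ on your installation:
[tcpout]
defaultGroup = default-autolb-group

[tcpout:default-autolb-group]
server = 10.10.10.10:9997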
Use the following command to list the Splunk servers to which the host is forwarding logs:
/opt/splunkforwarder/bin/splunk list forward-server
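If the connection is up, the output should look roughly like this (illustrative; the exact wording depends on the Splunk version):
Active forwards:
    10.10.10.10:9997
Configured but inactive forwards:
    None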
Now configure the "inputs.conf" and "props.conf" Splunk configuration files for the SAP system to forward the Audit Log to the Splunk Indexer. Navigate to the Splunk configuration file directory - in our example /opt/splunkforwarder/etc/system/local/.
Edit "inputs.conf" - the configuration file should contain the directory where the SAP audit log is located, the new Splunk index which we created and the sourcetype which we define to be "sap:auditlog".
[default]
host = <sap_system_hostname>
[monitor:///SAL]
index = sap_auditlog
sourcetype = sap:auditlog
Edit "props.conf" - the file should contain the following definitions for our sourcetype:
[sap:auditlog]
CHARSET=UTF-16LE
NO_BINARY_CHECK=false
detect_trailing_nulls = false
Go on and restart the Splunk forwarder on the SAP system.
/opt/splunkforwarder/bin/splunk restart
With that, we are done with the configuration on the SAP system side, and you should already see unsorted, unparsed SAP audit log data on your Splunk Indexer.
Troubleshooting
Here are some issues that may occur in the process and how you can fix them.
File System Permissions on Log Files
The (OS service) account under which the Splunk Forwarder runs needs read access to the SAP audit log directory "/SAL". You will therefore need to grant/change the permissions on the directory/files, or, for example, add the Splunk Forwarder service user to the "sapsys" group. Do not run the Splunk service as "root" or as the "<SID>adm" user account.
Resend log events from the SAP system to the Splunk Indexer
It may be helpful to trigger a resend of all log files from the SAP system. To do this, remove all files from the Splunk fishbucket directory on the SAP system and restart the Splunk Forwarder service:
rm -R /opt/splunkforwarder/var/lib/splunk/fishbucket
/opt/splunkforwarder/bin/splunk restart
SAP Audit Log Parsing
As ingested so far, the SAP security audit log is one big line of log entries which needs to be sliced appropriately into events and fields. Go to the Splunk Indexer, navigate to /opt/splunk/etc/system/local and edit the file "props.conf". Add the following configuration - the LINE_BREAKER regex marks the boundary of each record by a leading record-format digit (2 or 3, consumed as the event delimiter), followed by a three-character message ID and a 14-digit timestamp:
[sap:auditlog]
category = Custom
BREAK_ONLY_BEFORE_DATE =
LINE_BREAKER = ([23])[A-Z][A-Z][A-Z0-9]\d{14}00
TIME_PREFIX=\w{3}
TIME_FORMAT=%Y%m%d%H%M%S
MAX_TIMESTAMP_LOOKAHEAD = 14
SHOULD_LINEMERGE = false
TRANSFORMS=split
Now we need to add the "split" transformation, which will identify the individual log values and insert a "|" (pipe character) between the fields of each log entry. Open the "transforms.conf" file under /opt/splunk/etc/system/local on the Splunk Indexer and add the following configuration:
[split]
DEST_KEY=_raw
SOURCE_KEY=_raw
REGEX = ^(.{3})(.{8})(.{6})(\w\w)(.{5})(.{2})(.{3})(.)(.)(.{8})(.{12})(.{20})(.{40})(.{3})(.)(.{64})(.{20})
FORMAT=$1|$2|$3|$4|$5|$6|$7|$8|$9|$10|$11|$12|$13|$14|$15|$16|$17
Take this log entry as an example:
AUW20200616081842001684400004B4 DDIC RSDBA_DBH_SETUP_UPDATE_CHECK 0001RSDBA_DBH_SETUP_UPDATE_CHECK&
This will be matched as follows, allowing us to identify the different values in one log entry:
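Based on the capture-group widths in the REGEX above and the field names we will define shortly in "transforms.conf", the beginning of the entry breaks down as follows (assuming the sample preserves the original fixed-width spacing):
$1 = AUW        (message_id)
$2 = 20200616   (date)
$3 = 081842     (time)
$4 = 00         (term1)
$5 = 16844      (os_process_id)
$6 = 00         (term2)
$7 = 004        (work_process_number)
$8 = B          (sap_process)
$9 = 4          (WP)
The remaining fixed-width groups carry the user (DDIC), the transaction/program, the client and the message payload.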
Now we need to create a new Splunk App. Go to "Apps", click on "Manage Apps" and then on "Create app".
Enter the details of the new app: App "Name" - "SAP Security Audit Log" and "Folder name" - "sap_auditlog". This will generate the respective directory and file structure for the new Splunk App in the background.
Go to the directory of the new app "SAP Security Audit Log" and navigate to the already existing "local" folder: /opt/splunk/etc/apps/sap_auditlog/local
Create a new "props.conf" file with the following content:
[sap:auditlog]
category = Custom
REPORT-SAL = REPORT-SAL
EXTRACT-SAL-A = ^.{130}(?<VarA>.*?[^&\|]*)&*
EXTRACT-SAL-B = ^.{130}.*?&(?<VarB>.*?[^&\|]*)&*
EXTRACT-SAL-C = ^.{130}.*?&.*?&(?<VarC>.*?[^&\|]*)&*
EXTRACT-SAL-D = ^.{130}.*?&.*?&.*?&(?<VarD>.*?[^&\|]*)&*
EXTRACT-SAL-E = ^.{130}.*?&.*?&.*?&.*?&(?<VarE>.*?[^&\|]*)&*
EXTRACT-SAL-F = ^.{130}.*?&.*?&.*?&.*?&.*?&(?<VarF>.*?[^\|]*)
LOOKUP-auto_sm20 = sm20 message_id AS message_id OUTPUTNEW audit_class AS audit_class event_class AS event_class new_in_release AS new_in_release log_message AS log_message
The EXTRACT statements recover the dynamic sub-messages (VarA, VarB, VarC, VarD, VarE and VarF), which are embedded "&"-separated in the message payload, and the LOOKUP maps message IDs to additional metadata via the sm20 lookup, which we will discuss shortly. In the sample entry above, for instance, VarA would capture "RSDBA_DBH_SETUP_UPDATE_CHECK" as the first "&"-separated variable.
Also create a new "transforms.conf" file in the new app under /opt/splunk/etc/apps/sap_auditlog/local with the following content:
[REPORT-SAL]
DELIMS = "|"
FIELDS = "message_id","date","time","term1","os_process_id","term2","work_process_number","sap_process","WP","term3","user","transaction","app","client","term4","message","src"
[sm20]
batch_index_query = 0
case_sensitive_match = 1
filename = sm20.csv
Now we want to get a bit more information into Splunk by mapping the message IDs to the details which the SAP system itself provides when you review the logs in SAP transaction SM20. In the "transforms.conf" and "props.conf" above we have already prepared the configuration; we only need to add the lookup file with the respective message IDs and additional data like audit class, event class, log message, etc.
To do this go to the "SAP Security Audit Log" App directory under "/opt/splunk/etc/apps/sap_auditlog/" and create a new directory called "lookups".
Then download the following file and save it in the new "lookups" directory under the file name "sm20.csv".
The first line of the CSV file contains the field names, which must match our "props.conf" configuration above so that the additional metadata is mapped appropriately onto our Splunk events. Here is a head output of the file:
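Based on the LOOKUP definition in "props.conf" above, the header and a sample row could look roughly like this (the AU2 row is an illustrative sketch - the exact texts in your sm20.csv may differ):
message_id,audit_class,event_class,new_in_release,log_message
AU2,Logon,Severe,,Logon failed (reason=&B, type=&A)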
We are DONE! The logs should now be parsed appropriately, and we can do all the good things Splunk has to offer with the SAP security audit log.
Troubleshooting
Here are a few helpful troubleshooting tips.
Drop data and start from the beginning (test systems only!)
Stop Splunk Indexer
/opt/splunk/bin/splunk stop
Drop all events in the index
/opt/splunk/bin/splunk clean eventdata -index sap_auditlog
Start Splunk Indexer
/opt/splunk/bin/splunk start
Resend Logs from SAP system (execute on SAP host)
rm -R /opt/splunkforwarder/var/lib/splunk/fishbucket
/opt/splunkforwarder/bin/splunk restart
First SAP SIEM Use Case - Detect Account Brute-Force Attacks
On the basis of the SAP security audit log we will build our first simple SIEM monitoring use case to detect password brute-force attacks on our SAP systems.
A simple approach is to monitor the logs for frequently recurring AU2 message IDs, which stand for "Dialog Logon Failed".
Enter the following search query in Splunk:
index="sap_auditlog" message_id="AU2"
We can already see all failed logons occurring on the SAP system almost in real time.
Create a Brute-Force Splunk Alert
For test purposes we will create a Splunk email alert that fires as soon as someone generates 3 or more failed logins from the same source (IP). We will use the following Splunk query:
index="sap_auditlog" message_id="AU2" | stats count by src | search count > 2
Go on and save an alert based on the query.
Give the alert a name and a description. Set the "Alert type" to "Real-time" (keep in mind that this is expensive - in production you would rather use a scheduled alert). Add "Send an Email" as a "Trigger Action" and enter the e-mail address to which the alerts should be sent.
Save the alert and we are done. The next time the SAP system receives more than 2 failed logins within a short period of time from the same client/IP address, an email alert will be triggered to the respective contact person, who can then process the SIEM event.
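For a production setup, a scheduled alert over a fixed time window is much cheaper than a real-time alert. A sketch of a query you could schedule, e.g., every five minutes (the 5-minute window is our assumption - tune it to your environment):
index="sap_auditlog" message_id="AU2" earliest=-5m | stats count by src | where count > 2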
Further SAP SIEM Use Cases
Based on the SAP Security Audit Log we can derive dozens of meaningful SIEM monitoring use cases for our SAP landscape, for example (a query sketch for the first one follows the list):
System logons with emergency and standard user accounts
Assignment of overprivileged profiles (e.g. SAP_ALL)
Assignment of critical authorizations (e.g. S_DEVELOP)
Execution of critical function modules and transactions
Change of user type (e.g. Dialog vs. System)
etc.
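As an illustration, here is a minimal query sketch for the first use case, assuming message ID AU1 ("Logon Successful") and the field names from our parsing configuration - shown for the standard account DDIC:
index="sap_auditlog" message_id="AU1" user="DDIC" | table _time, src, user, transaction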
Do you need support developing SAP SIEM monitoring use cases? Contact us - we can help you develop the use cases relevant for your enterprise.
Thanks
To Frank Buchholz who has posted numerous great articles about SAP logs - e.g.:
To Andreas Siegert ( afx@afximages.com ) who has posted great Splunk tutorials on the topic of parsing SAP logs:
https://community.splunk.com/t5/user/viewprofilepage/user-id/177530
Do you need support for your SAP systems? Contact us! We help enterprises plan secure IT infrastructures, assess IT services, mitigate risks and run compliant operations.