Preliminary study on Agent Transmission Mechanism of OSSIM Sensor

2025-02-28 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/02 Report--

The main responsibility of the OSSIM agent is to collect the data sent by the various devices on the network and forward it to the OSSIM server in a standard way: the agent first normalizes the collected data and then sends it to the server. This article discusses how the agent sends data in an orderly manner and how normalization is accomplished.

The OSSIM sensor converts the communication protocol and data format between the OSSIM agent and the OSSIM server through the GET framework. Let's take a brief look at the ossim-agent script:

#!/usr/bin/python -OOt
import sys
sys.path.append('/usr/share/ossim-agent/')
sys.path.append('/usr/local/share/ossim-agent/')
from ossim_agent.Agent import Agent
agent = Agent()
agent.main()

Here the GET framework, acting as the OSSIM agent, transports data to the OSSIM server. Tight integration requires two main actions: "generating" (or "mapping") OSSIM-compatible events, and "transferring" that data to the OSSIM server. The two GET components responsible for these operations are the Event Handler and the Sender Agent, as shown in figure 1.

Figure 1 integrating Get framework content into OSSIM

The main task of the Event Handler is to map the events collected by the data-source plug-ins to OSSIM's standardized event format for SIEM instance alerts. During this process the original message is transformed from the raw log into the normalized data-field format; in the figure above, these mechanisms are labeled "normalization" and "OSSIM messages." Partial log normalization code:

from Logger import Logger
from time import mktime, strptime

logger = Logger.logger


class Event:
    EVENT_TYPE = 'event'
    EVENT_ATTRS = [
        "type", "date", "sensor", "interface",
        "plugin_id", "plugin_sid", "priority", "protocol",
        "src_ip", "src_port", "dst_ip", "dst_port",
        "username", "password", "filename",
        "userdata1", "userdata2", "userdata3", "userdata4", "userdata5",
        "userdata6", "userdata7", "userdata8", "userdata9",
        "occurrences", "log", "data",
        "snort_sid",    # snort specific
        "snort_cid",    # snort specific
        "fdate", "tzone",
    ]

    def __init__(self):
        self.event = {}
        self.event["event_type"] = self.EVENT_TYPE

    def __setitem__(self, key, value):
        if key in self.EVENT_ATTRS:
            self.event[key] = self.sanitize_value(value)
            if key == "date":
                # self.event["fdate"] = self.event[key]
                try:
                    self.event["date"] = int(mktime(strptime(
                        self.event[key], "%Y-%m-%d %H:%M:%S")))
                except:
                    logger.warning("There was an error parsing date (%s)" %
                                   (self.event[key]))
        elif key != 'event_type':
            logger.warning("Bad event attribute: %s" % (key))

    def __getitem__(self, key):
        return self.event.get(key, None)

    # event representation
    def __repr__(self):
        event = self.EVENT_TYPE
        for attr in self.EVENT_ATTRS:
            if self[attr]:
                event += ' %s="%s"' % (attr, self[attr])
        return event + "\n"

    # returns the internal hash
    def dict(self):
        return self.event

    def sanitize_value(self, string):
        return str(string).strip().replace("\"", "\\\"").replace("'", "")


class EventOS(Event):
    EVENT_TYPE = 'host-os-event'
    EVENT_ATTRS = ["host", "os", "sensor", "interface", "date",
                   "plugin_id", "plugin_sid", "occurrences", "log", "fdate"]


class EventMac(Event):
    EVENT_TYPE = 'host-mac-event'
    EVENT_ATTRS = ["host", "mac", "vendor", "sensor", "interface", "date",
                   "plugin_id", "plugin_sid", "occurrences", "log", "fdate"]


class EventService(Event):
    EVENT_TYPE = 'host-service-event'
    EVENT_ATTRS = ["host", "sensor", "interface", "port", "protocol",
                   "service", "application", "date",
                   "plugin_id", "plugin_sid", "occurrences", "log", "fdate"]


class EventHids(Event):
    EVENT_TYPE = 'host-ids-event'
    EVENT_ATTRS = ["host", "hostname", "hids_event_type", "target", "what",
                   "extra_data", "sensor", "date", "plugin_id", "plugin_sid",
                   "username", "password", "filename",
                   "userdata1", "userdata2", "userdata3", "userdata4",
                   "userdata5", "userdata6", "userdata7", "userdata8",
                   "userdata9", "occurrences", "log", "fdate"]


class WatchRule(Event):
    EVENT_TYPE = 'event'
    EVENT_ATTRS = ["type", "date", "fdate", "sensor", "interface",
                   "src_ip", "dst_ip", "protocol", "plugin_id", "plugin_sid",
                   "condition", "value", "port_from", "src_port", "port_to",
                   "dst_port", "interval", "from", "to", "absolute", "log",
                   "userdata1", "userdata2", "userdata3", "userdata4",
                   "userdata5", "userdata6", "userdata7", "userdata8",
                   "userdata9", "filename", "username"]


class Snort(Event):
    EVENT_TYPE = 'snort-event'
    EVENT_ATTRS = ["sensor", "interface", "gzipdata", "unziplen",
                   "event_type", "plugin_id", "type", "occurrences"]
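To see what a serialized event actually looks like on the wire, here is a minimal, standalone sketch of the Event class above; the Logger dependency is dropped, only a few attributes are kept, and the attribute values are invented for illustration:

```python
class Event:
    # Trimmed re-creation of the agent's Event class, for illustration only.
    EVENT_TYPE = 'event'
    EVENT_ATTRS = ["type", "date", "sensor", "plugin_id", "plugin_sid",
                   "src_ip", "dst_ip", "log"]

    def __init__(self):
        self.event = {"event_type": self.EVENT_TYPE}

    def __setitem__(self, key, value):
        # keep only known attributes, stored as stripped strings
        if key in self.EVENT_ATTRS:
            self.event[key] = str(value).strip()

    def __getitem__(self, key):
        return self.event.get(key, None)

    def __repr__(self):
        # key="value" pairs appended after the event type
        out = self.EVENT_TYPE
        for attr in self.EVENT_ATTRS:
            if self[attr]:
                out += ' %s="%s"' % (attr, self[attr])
        return out + "\n"


e = Event()
e["plugin_id"] = 1501        # hypothetical plugin id
e["plugin_sid"] = 200        # hypothetical plugin sid
e["src_ip"] = "192.168.11.5"
print(repr(e))
```

The `__repr__` output is the flat `key="value"` line format that the agent writes to its server connection.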

Log encoding and dispatch code (the agent's Detector class):

import threading, time

from Logger import Logger
from Output import Output
import Config
import Event
from Threshold import EventConsolidation
from Stats import Stats
from ConnPro import ServerConnPro

logger = Logger.logger


class Detector(threading.Thread):
    def __init__(self, conf, plugin, conn):
        self._conf = conf
        self._plugin = plugin
        self.os_hash = {}
        self.conn = conn
        self.consolidation = EventConsolidation(self._conf)
        logger.info("Starting detector %s (%s)." %
                    (self._plugin.get("config", "name"),
                     self._plugin.get("config", "plugin_id")))
        threading.Thread.__init__(self)

    def _event_os_cached(self, event):
        if isinstance(event, Event.EventOS):
            import string
            current_os = string.join(string.split(event["os"]), ' ')
            previous_os = self.os_hash.get(event["host"], '')
            if current_os == previous_os:
                return True
            else:
                # cache miss: add the entry to the cache
                self.os_hash[event["host"]] = \
                    string.join(string.split(event["os"]), ' ')
        return False

    def _exclude_event(self, event):
        if self._plugin.has_option("config", "exclude_sids"):
            exclude_sids = self._plugin.get("config", "exclude_sids")
            if event["plugin_sid"] in Config.split_sids(exclude_sids):
                logger.debug("Excluding event with " +
                             "plugin_id=%s and plugin_sid=%s" %
                             (event["plugin_id"], event["plugin_sid"]))
                return True
        return False

    def _thresholding(self):
        self.consolidation.process()

    def _plugin_defaults(self, event):
        # fetch default parameters from the configuration file
        if self._conf.has_section("plugin-defaults"):
            # 1) date
            default_date_format = self._conf.get("plugin-defaults",
                                                 "date_format")
            if event["date"] is None and default_date_format and \
               'date' in event.EVENT_ATTRS:
                event["date"] = time.strftime(default_date_format,
                                              time.localtime(time.time()))
            # 2) sensor
            default_sensor = self._conf.get("plugin-defaults", "sensor")
            if event["sensor"] is None and default_sensor and \
               'sensor' in event.EVENT_ATTRS:
                event["sensor"] = default_sensor
            # 3) network interface
            default_iface = self._conf.get("plugin-defaults", "interface")
            if event["interface"] is None and default_iface and \
               'interface' in event.EVENT_ATTRS:
                event["interface"] = default_iface
            # 4) source IP
            if event["src_ip"] is None and 'src_ip' in event.EVENT_ATTRS:
                event["src_ip"] = event["sensor"]
            # 5) time zone
            default_tzone = self._conf.get("plugin-defaults", "tzone")
            if event["tzone"] is None and 'tzone' in event.EVENT_ATTRS:
                event["tzone"] = default_tzone
            # 6) sensor, source ip and dest ip != localhost
            if event["sensor"] in ('127.0.0.1', '127.0.1.1'):
                event["sensor"] = default_sensor
            if event["dst_ip"] in ('127.0.0.1', '127.0.1.1'):
                event["dst_ip"] = default_sensor
            if event["src_ip"] in ('127.0.0.1', '127.0.1.1'):
                event["src_ip"] = default_sensor
        # detector log type
        if event["type"] is None and 'type' in event.EVENT_ATTRS:
            event["type"] = 'detector'
        return event

    def send_message(self, event):
        if self._event_os_cached(event):
            return
        if self._exclude_event(event):
            return
        # use default values for some empty attributes
        event = self._plugin_defaults(event)
        # send to the server, reconnecting if necessary
        if self.conn is not None:
            try:
                self.conn.send(str(event))
            except:
                id = self._plugin.get("config", "plugin_id")
                c = ServerConnPro(self._conf, id)
                self.conn = c.connect(0, 10)
                try:
                    self.conn.send(str(event))
                except:
                    return
            logger.info(str(event).rstrip())
        elif not self.consolidation.insert(event):
            Output.event(event)
        Stats.new_event(event)

    def stop(self):
        # self.consolidation.clear()
        pass

    # overridden in subclasses
    def process(self):
        pass

    def run(self):
        self.process()


class ParserSocket(Detector):
    def process(self):
        pass  # body truncated in this excerpt


class ParserDatabase(Detector):
    def process(self):
        pass  # body truncated in this excerpt
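Step 6 of `_plugin_defaults` (rewriting loopback addresses to the configured sensor address) can be isolated into a small standalone sketch; the dictionary-based event and the sensor address below are illustrative assumptions, not the agent's real types:

```python
LOCALHOST_ADDRS = ('127.0.0.1', '127.0.1.1')

def apply_sensor_default(event, default_sensor):
    """Replace loopback sensor/source/destination addresses with the
    configured sensor IP, mirroring step 6 of Detector._plugin_defaults."""
    for field in ("sensor", "src_ip", "dst_ip"):
        if event.get(field) in LOCALHOST_ADDRS:
            event[field] = default_sensor
    return event

event = {"sensor": "127.0.0.1", "src_ip": "10.0.0.8", "dst_ip": "127.0.1.1"}
print(apply_sensor_default(event, "192.168.11.7"))
```

Without this substitution, events generated on the sensor itself would carry meaningless 127.0.0.1 addresses into the SIEM.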

... ...

As can be seen above, sensor normalization is mainly responsible for re-encoding the data fields within each log to generate a new, complete event that can be sent to the OSSIM server. To achieve this, the GET framework contains specific functions that apply BASE64 encoding to every field that requires it. The "OSSIM message" step is responsible for populating fields that do not exist in the original events generated by GET. The plugin_id and plugin_sid mentioned above indicate the source type and subtype of the log message, and both are required fields for generating SIEM events. For event-format integrity, the source or destination IP field is sometimes populated with 0.0.0.0 by default when the address cannot be determined.
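As a quick illustration of the BASE64 step, the sketch below encodes a raw log line the way a binary-unsafe field would be wrapped for transport; the log line itself is invented:

```python
import base64

# Hypothetical raw Apache access-log line collected by a plugin.
raw_log = '192.168.11.5 - - [28/Feb/2025:10:15:02 +0800] "GET / HTTP/1.1" 200 612'

# Encode for safe transport; the receiving side decodes it back.
encoded = base64.b64encode(raw_log.encode("utf-8")).decode("ascii")
decoded = base64.b64decode(encoded).decode("utf-8")

print(encoded)
assert decoded == raw_log  # the round trip is lossless
```

BASE64 guarantees that quotes, newlines, or non-ASCII bytes inside the original log cannot break the flat `key="value"` event format.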

Note: we can use the phpMyAdmin tool to view these required fields in OSSIM's MySQL database.

Sender Agent is responsible for the following two tasks:

1) Send events collected by GET and formatted by the Event Handler to the OSSIM server. This task is implemented through message queues created by the Event Handler and sent to the message middleware. The timing diagram is shown in figure 2.

Figure 2 sequence diagram: conversion from security probe logs to OSSIM server events

2) Manage the communication between the GET framework and the OSSIM server, which takes place over TCP port 40001 with a two-way handshake. The normalized raw log is an important part of the standardization process: OSSIM retains the original log while normalizing it, which supports log archiving and provides a means to extract the original log from standardized events.

The normalized events are stored in the MySQL database, as shown in figure 3. The correlation engine then performs cross-correlation analysis according to rules, priority, reliability, and other parameters, computes the risk value, and issues the various alarm messages.
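The risk value mentioned here is commonly documented for OSSIM as risk = (asset value × priority × reliability) / 25, with asset value and priority on a 0-5 scale and reliability on a 0-10 scale; treat the exact scales as an assumption for your OSSIM version. A one-function sketch:

```python
def risk(asset_value, priority, reliability):
    """OSSIM-style risk score in the range 0..10, assuming
    asset 0..5, priority 0..5, reliability 0..10."""
    return (asset_value * priority * reliability) / 25.0

print(risk(3, 4, 6))    # 72 / 25 = 2.88
print(risk(5, 5, 10))   # maximum: 10.0
```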

Figure 3 Log storage mechanism for OSSIM platform

Next, let's look at an example. Below are raw logs from Apache, Cisco ASA, and SSH, as shown in figures 4, 5, and 6.

Regular expressions in the Apache plug-in:

[0001-apache-access]
# access log
event_type=event
regexp=((?P<src>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}) (?P<id>\S+) (?P<user>\S+) \[(?P<date>\d{2}/\w{3}/\d{4}:\d{2}:\d{2}:\d{2})\s+[+-]\d{4}\] \"(?P<request>[^\"]*)\" (?P<code>\d{3}) ((?P<size>\d+)|-)( \"(?P<referer_uri>[^\"]*)\" \"(?P<useragent>[^\"]*)\")?)$
src_ip={resolv($src)}
dst_ip={resolv($dst)}
dst_port={$port}
date={normalize_date($date)}
plugin_sid={$code}
username={$user}
userdata1={$request}
userdata2={$size}
userdata3={$referer_uri}
userdata4={$useragent}
filename={$id}
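To check that a pattern like the one above actually captures the fields, here is a small harness using a cleaned-up approximation of the access-log regexp (not the exact plugin pattern) against an invented log line:

```python
import re

# Simplified approximation of the plugin's access-log regexp.
APACHE_ACCESS = re.compile(
    r'(?P<src>\d{1,3}(?:\.\d{1,3}){3})\s+(?P<id>\S+)\s+(?P<user>\S+)\s+'
    r'\[(?P<date>\d{2}/\w{3}/\d{4}:\d{2}:\d{2}:\d{2})\s+[+-]\d{4}\]\s+'
    r'"(?P<request>[^"]*)"\s+(?P<code>\d{3})\s+(?:(?P<size>\d+)|-)'
)

# Hypothetical Apache combined-log line.
line = ('192.168.11.5 - - [28/Feb/2025:10:15:02 +0800] '
        '"GET /index.html HTTP/1.1" 200 4523')

m = APACHE_ACCESS.match(line)
print(m.group("src"), m.group("code"), m.group("request"))
```

Each named group (`src`, `date`, `code`, ...) corresponds to a `$variable` that the plugin maps into a normalized event field such as src_ip or plugin_sid.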

[0002-apache-error]
# error log
event_type=event
regexp=\[(?P<date>\w{3}\s+\w{3}\s+\d{1,2}\s+\d{2}:\d{2}:\d{2}\s+\d{4})\]\s+\[(?P<type>(emerg|alert|crit|warn|notice|info|debug))\]\s+(\[client (?P<src>\S+)\])?\s*(?P<data>.*)
date={normalize_date($date)}
plugin_sid={translate($type)}
src_ip={resolv($src)}
userdata1={$data}

Figure 4 Apache raw log

Figure 5 A Cisco ASA raw log

Figure 6 Cisco ASA event classification

After OSSIM's normalization, the actual format is displayed through the Web front end. How to compare normalized events with the original log is also explained in the book "Open Source Security Operation and Maintenance Platform OSSIM Troubleshooting: Getting Started". In the example shown in figure 7, only Userdata1 and Userdata2 are used, while Userdata3~Userdata9 are not. These are extension fields, mainly reserved for other devices or services; here the destination is recorded as an IP address, such as Host 192.168.11.160. In fact, normalization occurs after the system collects and stores events and before correlation and data analysis. In a SIEM tool, converting the data into an easy-to-read format makes it much easier to understand what was gathered during collection.
