Sagan - a multi-threaded, high-performance log analysis engine

,-._,-.    Sagan, the advanced Suricata/Snort like log analysis engine!
\/)"(\/ 
 (_o_)     Champ Clark III & The Quadrant InfoSec Team [quadrantsec.com]
 /   \/)   Copyright (C) 2009-2021 Quadrant Information Security, et al.
(|| ||) 
 oo-oo  

Join the Sagan Discord channel: https://discord.gg/n6ReCZED

Sagan Documentation

Sagan "Read the Docs": https://sagan.readthedocs.io

What is Sagan?

Sagan is an open source (GNU/GPLv2), high-performance, real-time log analysis & correlation engine. It is written in C and uses a multi-threaded architecture to deliver high-performance log & event analysis. The Sagan structure and Sagan rules work similarly to the Suricata & Snort IDS engines. This was done intentionally to maintain compatibility with rule management software (oinkmaster/pulledpork/etc.) and to allow Sagan to correlate log events with your IDS/IPS system.

Sagan can write out to databases via the Suricata EVE format and/or Unified2, making it compatible with all Snort & Suricata consoles. Sagan can also write out JSON, which can be ingested by Elasticsearch and viewed with consoles like Kibana, EveBox, etc.

Sagan supports many different output formats, log normalization (via liblognorm), GeoIP detection, script execution on events, and automatic firewall support via "Snortsam" (see http://www.snortsam.net).

Sagan uses the GNU "artistic style".

Sagan Features:

  • Sagan’s multi-threaded architecture allows it to use all CPUs / cores for real-time log processing.
  • Sagan is light on CPU and memory resources.
  • Sagan uses a similar rule syntax to Cisco’s “Snort” & Suricata which allows for easy rule management and correlation with Snort or Suricata IDS / IPS systems.
  • Sagan can store alert data in Cisco’s “Snort” native “unified2” binary data format or Suricata's JSON format for easier log-to-packet correlation.
  • Sagan is compatible with popular graphical security consoles like Snorby, BASE, Sguil, and EveBox.
  • Sagan can easily export data to other SIEMs via syslog.
  • Sagan can track events based on geographic locations via IP address source or destination data (e.g., identifying logins from strange geographic locations).
  • Sagan can monitor usage based on time of day (e.g., writing a rule to trigger when an administrator logs in at 3:00 AM).
  • Sagan has multiple means of parsing and extracting data, through liblognorm or built-in parsing rule options like parse_src_ip, parse_dst_ip, parse_port, parse_string, and parse_hash (MD5, SHA1, SHA256).
  • Sagan can query custom blacklists, Bro Intel subscriptions (like Critical Stack), and "Bluedot", Quadrant Information Security's threat intelligence feed, by IP address, hashes (MD5, SHA1, SHA256), URLs, emails, usernames, and much more.
  • Sagan’s “client tracking” can inform you when machines start or stop logging. This helps you verify that you are getting the data you need.
  • Sagan uses “xbits” to correlate data between log events which allows Sagan to “remember” and flag events across multiple log lines and sources.
  • Sagan uses inter-process communication between Sagan processes to share data. Sagan can also use Redis (beta) to share data between Sagan instances within a network.
  • To help reduce “alert fatigue”, Sagan can “threshold” or only alert “after” certain criteria have been met.
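
As an illustration of the Snort-like syntax these features build on (a hypothetical rule; the sid, program, and values are invented for this sketch), a Sagan rule combining content matching, parsing, and thresholding looks much like a Snort rule:

```
alert any $EXTERNAL_NET any -> $HOME_NET any (msg:"[EXAMPLE] SSH brute force attempt"; \
content: "Failed password"; program: sshd; parse_src_ip: 1; \
threshold: type suppress, track by_src, count 5, seconds 300; \
classtype: unsuccessful-user; sid:9999999; rev:1;)
```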

Where can I get help with Sagan?

For more general Sagan information, please visit the official Sagan web site: https://sagan.quadrantsec.com.

For Sagan documentation to assist with installation, rule writing, etc., check out: https://sagan.readthedocs.io/en/latest/

For help & assistance, check out the Sagan mailing list, located at: https://groups.google.com/forum/#!forum/sagan-users. You can also ask questions on the Sagan Discord channel at https://discord.gg/n6ReCZED

If you're looking for Sagan rule sets on Github, they are located at: https://github.com/beave/sagan-rules

Owner
Quadrant Information Security
Quadrant Information Security is a consulting company based in Jacksonville, FL. We operate a 24/7 SOC/MSSP and develop the Sagan Log Analysis Engine - sagan.io
Comments
  • External output plugin produces invalid JSON

    The external output plugin sends JSON with a dangling "normalize" key, making the entire JSON document invalid.

    Looks like https://github.com/quadrantsec/sagan/blob/d264dc3081715d7f360740fd53fded5de2ed7ff1/src/output-plugins/external.c#L192 might be at fault. I'm unsure though if Event->json_normalize should be true for rules not using the normalize; keyword.

    { "signature_id": 505234, "signature": "Authentication failure", "rev": 1, "severity": 1, "category": "unsuccessful-user", "priority": 1, "timestamp": "01-07-2022 12:32:21.859190", "drop": "false", "flow_id": 852827190, "in_iface": "", "src_ip": "192.168.10.33", "src_port": 514, "dest_ip": "192.168.10.33", "dest_port": 514, "xff": "192.168.10.33", "proto": "UDP", "syslog_facility": "local7", "syslog_level": "info", "syslog_priority": "info", "syslog_message": "[2022\/01\/07 12:32:15.228367,  3] ..\/..\/source4\/auth\/kerberos\/krb5_init_context.c:80(smb_krb5_debug_wrapper)#012  Kerberos: Failed to decrypt PA-DATA -- [email protected] (enctype aes256-cts-hmac-sha1-96) error Decrypt integrity check failed for checksum type hmac-sha1-96-aes256, key type aes256-cts-hmac-sha1-96#", "normalize":  }
    
    alert any $EXTERNAL_NET any -> $HOME_NET any (msg: "Authentication failure"; content:"Kerberos: Failed to decrypt PA-DATA -- "; program: samba; classtype: unsuccessful-user; sid: 505234; rev: 1; external: /usr/local/sagan/external.py)
    
  • Event_id - detection doesn't work

    Hello, I am using Sagan's JSON input module, but I pull the event ID from the message rather than from a JSON field. The event_id is at the beginning of the message field. In version 2.0.1 the event_id field is "UNDEFINED". After the patch ( https://groups.google.com/g/sagan-users/c/ju-3g2vIYgE ) the result is that event_id is an empty string.

    Best Regards Ivan


    [D] Data in _Sagan_Proc_Syslog (including extracted JSON)
    [D] -----------------------------------------------------------------------------
    [D] * message: " 4727: A security-enabled global group was created. Subject: Security ID: S-1-5-21-3641769155-4107095991-1524253519-8107 Account Name: USER Account Domain: DOMAIN Logon ID: 0x4DCCB98 New Group: Security ID: S-1-5-21-3641769155-4107095991-1524253519-31261 Group Name: dddddd Group Domain: DOMAIN Attributes: SAM Account Name: dddddd SID History: - Additional Information: Privileges: -"
    [D] * program: "Security"
    [D] * host: "10.XXX.XXX.XX"
    [D] * level: "info"
    [D] * facility: "user"
    [D] * priority: "14"
    [D] * tag: "UNDEFINED"
    [D] * time: "20:51:58"
    [D] * date: "2021-01-28"
    [D] * src_ip : ""
    [D] * dst_ip : ""
    [D] * src_port : "0"
    [D] * dst_port : "0"
    [D] * proto : "0"
    [D] * ja3: ""
    [D] * event_id: ""
    [D] * md5: ""
    [D] * sha1: ""
    [D] * sha256: ""
    [D] * filename: ""
    [D] * hostname: ""
    [D] * url: ""
    [D] * username: ""

  • fails to build with GCC 10: multiple definitions of the same variable

    As documented at https://gcc.gnu.org/gcc-10/porting_to.html, GCC 10 defaults to -fno-common, which reveals issues with the code in sagan.

    See https://bugs.debian.org/957771 for more details.

    A workaround is to build with -fcommon, but it would be better to clean up the code.

  • [sagan-users] SEGV version 2.0.0

    Hello all,

    Today I tried updating to version 2.0.0 from the new github repo.

    If I start sagan from the shell it starts up OK, but as soon as sagan is started as a daemon with systemctl start sagan I get a failure (signal=SEGV).

    If I revert to earlier versions in the old repo I can get commit c9c22a5ea0882b5954e8dba94dcbf533a75cd0c1 from 26-10 running, but commit 43cd81adafdf6686584d40e8d1bc64fb120ddba6 from 28-10 gives me the same SEGV message as the most recent version.

    I'm wondering if someone has also encountered the same issue and knows a solution.

    The OS I'm running on is Ubuntu 18.04 LTS, and I have attached the core dump to this message.

    Best regards,

    Stef stacktrace.txt

  • Better checking for alert type.

    "alert ip" isn't valid in Sagan.

    "Valid options for this field are any, tcp, udp or icmp. In most cases, you will likely want to specify any. The protocol is determined by the parse_proto or parse_program_proto rule options."

    Sagan should check for valid types and error out if one is invalid.

  • Rule errors with fresh build from source and rules from current repo

    I followed the install guide at https://sagan.readthedocs.io/en/latest/install.html but got the source from quadrantsec instead of beave. It compiled and installed on Debian 10. I downloaded the updated rules from the quadrantsec repo, and Sagan kept giving me errors when starting. I downloaded the rules from beave and it started without problems.

    The errors were something about "expected 'track'" on several rules like cisco-correlation and something else.

  • sagan fails to start

    Hi,

    I have just installed sagan (1.2.2-1) on Arch Linux (5.10.56-1-lts), with no install errors to report, but I am going through the "Post-installation setup and testing" section of the Sagan User Guide (readthedocs):

    • "To test, run sagan --debug syslog,engine as the root user. It will switch to the sagan user when ready, and remain running in the foreground."

      command: $ sudo sagan --debug syslog,engine
      output: sagan: error while loading shared libraries: libesmtp.so.6: cannot open shared object file: No such file or directory
      issue: sagan fails to start
      libesmtp version: community/libesmtp 1.1.0-1 [installed]

    libesmtp files:

      /usr/include/auth-client.h
      /usr/include/libesmtp.h
      /usr/lib/
      /usr/lib/esmtp-plugins-6.2.0/
      /usr/lib/esmtp-plugins-6.2.0/sasl-crammd5.so
      /usr/lib/esmtp-plugins-6.2.0/sasl-login.so
      /usr/lib/esmtp-plugins-6.2.0/sasl-ntlm.so
      /usr/lib/esmtp-plugins-6.2.0/sasl-plain.so
      /usr/lib/libesmtp.so
      /usr/lib/libesmtp.so.6.2.0

    At first sight it looks like sagan cannot find the libesmtp.so.6 library. Do you have any suggestions?

    Thanks,

  • json_pcre { } range error

    Example: json_pcre:".QuestionName", "/^.*\w{1,50}\.(\w+\.\w{2,10})$/";

    In the above example, if you use {1,50} sagan will error with [rules.c, line 2354] Missing last '/' in json_pcre. This doesn't affect the normal pcre keyword. Also, single digits like {50} work just fine, but if a range is specified the error will occur.

  • Found an issue with json_meta_contains

    Hello,

    Was trying to change sid:5003377 "[WINDOWS-AUTH] Suspicious network login from non-RFC1918" to leverage the json parsing.

    Given an event like this:

    {
      "syslog-source-ip": "10.2.33.41",
      "WorkstationName": "WORKSTATION",
      "VirtualAccount": "%%1843",
      "Version": "2",
      "TransmittedServices": "-",
      "TaskValue": "12544",
      "TargetUserSid": "WINDOWS\\auser",
      "TargetUserName": "auser",
      "TargetOutboundUserName": "-",
      "TargetOutboundDomainName": "-",
      "TargetLogonId": "0xasdfasdf",
      "TargetLinkedLogonId": "0x0",
      "TargetDomainName": "WINDOWS",
      "SubjectUserSid": "NULL SID",
      "SubjectUserName": "-",
      "SubjectLogonId": "0x0",
      "SubjectDomainName": "-",
      "SourceName": "Microsoft-Windows-Security-Auditing",
      "SourceModuleType": "im_msvistalog",
      "SourceModuleName": "winlog",
      "SeverityValue": "2",
      "Severity": "INFO",
      "RestrictedAdminMode": "-",
      "ProviderGuid": "{REDACTED}",
      "ProcessName": "-",
      "ProcessId": "0x0",
      "OrgName": "redacted",
      "OpcodeValue": "0",
      "Opcode": "Info",
      "Message": "An account was successfully logged on.  Subject:  Security ID:  S-1-0-0  Account Name:  -  Account Domain:  -  Logon ID:  0x0  Logon Information:  Logon Type:  3  Restricted Admin Mode: -  Virtual Account:  No  Elevated Token:  No  Impersonation Level:  Impersonation  New Logon:  Security ID: REDACTED Account Name:  auser  Account Domain:  WINDOWS  Logon ID:  0xasdfasdf Linked Logon ID:  0x0  Network Account Name: -  Network Account Domain: -  Logon GUID:  {00000000-0000-0000-0000-000000000000}  Process Information:  Process ID:  0x0  Process Name:  -  Network Information:  Workstation Name: WORKSTATION  Source Network Address: 24.18.x.x  Source Port:  64491  Detailed Authentication Information:  Logon Process:  NtLmSsp   Authentication Package: NTLM  Transited Services: -  Package Name (NTLM only): NTLM V2  Key Length:  128  This event is generated when a logon session is created. It is generated on the computer that was accessed.  The subject fields indicate the account on the local system which requested the logon. This is most commonly a service such as the Server service, or a local process such as Winlogon.exe or Services.exe.  The logon type field indicates the kind of logon that occurred. The most common types are 2 (interactive) and 3 (network).  The New Logon fields indicate the account for whom the new logon was created, i.e. the account that was logged on.  The network fields indicate where a remote logon request originated. Workstation name is not always available and may be left blank in some cases.  The impersonation level field indicates the extent to which a process in the logon session can impersonate.  The authentication information fields provide detailed information about this specific logon request.  - Logon GUID is a unique identifier that can be used to correlate this event with a KDC event.  - Transited services indicate which intermediate services have participated in this logon request.  
- Package name indicates which sub-protocol was used among the NTLM protocols.  - Key length indicates the length of the generated session key. This will be 0 if no session key was requested.",
      "LogonType": "3",
      "LogonProcessName": "NtLmSsp ",
      "LogonGuid": "{00000000-0000-0000-0000-000000000000}",
      "LmPackageName": "NTLM V2",
      "Keywords": "REDACTED",
      "KeyLength": "128",
      "IpPort": "64491",
      "IpAddress": "24.18.x.x",
      "ImpersonationLevel": "%%1833",
      "Hostname": "REDACTED",
      "ExecutionThreadID": "6024",
      "ExecutionProcessID": "752",
      "Evtid": " 4624: ",
      "EventType": "AUDIT_SUCCESS",
      "EventTime": "2021-02-26T16:31:08.165307-08:00",
      "EventReceivedTime": "2021-02-26T16:42:10.861381-08:00",
      "EventID": "4624",
      "ElevatedToken": "%%1843",
      "Channel": "Security",
      "Category": "Logon",
      "AuthenticationPackageName": "NTLM"
    
    }
    

    While working on the meta_content portion:

     meta_content:!"Source Network Address|3a| %sagan%",10.,192.168.,-,|3a 3a|1,127.0.0.1,172.16.,172.17.,172.18.,172.19.,172.20.,172.21.,172.22.,172.23.,172.24.,172.25.,172.26.,172.27.,172.28.,172.29.,172.30.,172.31,169.254,fe80,|3a 3a|1;
    

    I found a discrepancy in the application of json_meta_contains. It's necessary to use either json_pcre or json_meta_contains, since we just want to match on the beginning of the key IpAddress's value.

    When I tried to use:

    json_meta_content:!"IpAddress",10.,192.168.,-,|3a 3a|1,127.0.0.1,172.16.,172.17.,172.18.,172.19.,172.20.,172.21.,172.22.,172.23.,172.24.,172.25.,172.26.,172.27.,172.28.,172.29.,172.30.,172.31,169.254,fe80,|3a 3a|1; json_meta_contains;
    

    I was unable to load the signature receiving this error:

    [E] [rules.c, line 3538] Got bad rule option 'json_meta_contains' on line 268 of /usr/local/etc/sagan-rules/windows-auth.rules. Abort.
    

    The fully modified signature is :

    alert any any any -> any any (msg: "[WINDOWS-AUTH] Suspicious network login from non-RFC1918"; program: *Security*; event_id: 4624; json_content:".LogonType","3"; parse_src_ip: 1; json_meta_content:!"IpAddress",10.,192.168.,-,|3a 3a|1,127.0.0.1,172.16.,172.17.,172.18.,172.19.,172.20.,172.21.,172.22.,172.23.,172.24.,172.25.,172.26.,172.27.,172.28.,172.29.,172.30.,172.31,169.254,fe80,|3a 3a|1; json_meta_contains; json_meta_nocase; reference: url,findingbad.blogspot.cz/2017/12/a-few-of-my-favorite-things-continued.html; reference: url,wiki.quadrantsec.com/bin/view/Main/5003377; reference: url,www.quadrantsec.com/about/blog/using_jack_crooks_log_analysis_concepts_with_sagan; sid:5003377; classtype:suspicious-login; rev:6;)
    

    When I looked at the source code pertaining to the json_meta_contains declaration:

                        /* Set the previous "json_meta_strstr" to use strstr instead of strcmp */
    
                        /* TODO: Remove "json_meta_strstr" */
    
                        if (!strcmp(rulesplit, "json_meta_strstr") || !strcmp(rulesplit, "json_meta_contains") )
                            {
                                strtok_r(NULL, ":", &saveptrrule2);
                                rulestruct[counters->rulecount].json_meta_strstr[json_content_count-1] = 1;
                            }
    

    Shouldn't the use of "json_content_count-1" actually be "json_meta_content_count-1"?

    Similar to the way the json_meta_nocase modifier is coded:

    
                        if (!strcmp(rulesplit, "json_meta_nocase"))
                            {
                                strtok_r(NULL, ":", &saveptrrule2);
                                rulestruct[counters->rulecount].json_meta_content_case[json_meta_content_count-1] = true;
                            }
    
    
  • -L command line option log directory override

    There is a -l option to send the sagan.log file to a separate location, bypassing the config file. Could the -l option be changed to redirect all logs to the new location, or a -L option added for that?

  • Ability to read in gzip compressed files.

    Sagan can read in files on the disk. It would be great if Sagan could automatically read in gzip compressed files. zlib has gzopen() and a similar API for file calls. Might not be hard to do.

  • json_content SegFault without json_parse_data

    Rules using json_content without json_parse_data: enabled in the YAML file will produce a SegFault. This only seems to happen once Sagan sees JSON logs. If json_parse_data is enabled, there is no SegFault.

    [Switching to Thread 0x7ffdd4bf6700 (LWP 35603)]
    0x000055555556dc52 in JSON_Content ([email protected]=0, [email protected]=0x0)
        at json-content.c:55
    55      json-content.c: No such file or directory.
    (gdb) bt
    #0  0x000055555556dc52 in JSON_Content ([email protected]=0, [email protected]=0x0)
        at json-content.c:55
    #1  0x0000555555585f2c in Sagan_Engine ([email protected]=0x7ffd98000b60,
        [email protected]=0x0, dynamic_rule_flag=false) at processors/engine.c:656
    #2  0x000055555556d63d in Processor () at processor.c:224
    #3  0x00007ffff7aafea7 in start_thread (arg=<optimized out>) at pthread_create.c:477
    #4  0x00007ffff79dfdef in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:95
    
  • meta_content and json_meta_content modifier

    The meta_content and json_meta_content options both accept multiple values, which are compared using OR logic, meaning only one needs to match to make the statement true. What about a & modifier to switch them to AND logic, requiring all values to match? This can of course be done with multiple json_content statements, but a modifier may look cleaner for certain usages.

    This can also be accomplished with json_pcre, but the modifier may be faster than using pcre.

    Current Usage: json_content:".SourceImage", "|5c|AppData|5c|Local|5c|"; json_content:".SourceImage","|2e|exe"; distance:0; within:100;

    Proposed Usage: json_meta_content:&".SourceImage",|5c|AppData|5c|Local|5c|, |2e|exe;

    The proposed usage can even be processed in order. So in this example it would have to match |5c|AppData|5c|Local|5c| first, then |2e|exe would have to be found after the first match. This may need an additional modifier like json_meta_ordered;

    json_meta_content:&".SourceImage",|5c|AppData|5c|Local|5c|, |2e|exe; json_meta_ordered;

  • Sagan loads rules with "threshold" written incorrectly

    Observed behavior

    Two rules in the https://github.com/quadrantsec/sagan-rules repo are written incorrectly, possibly in an older format:

    alert any $EXTERNAL_NET any -> $HOME_NET any (msg:"[ASTERISK] Login session failed [0/5]"; content: "Wrong password"; classtype: unsuccessful-user; program: asterisk; reference: url,wiki.quadrantsec.com/bin/view/Main/5000179; threshold:suppress, track by_src, count 5, seconds 900; sid:5000179; rev:4;)
    
    alert any $EXTERNAL_NET any -> $HOME_NET any (msg:"[ASTERISK] Brute force login session failed [5/5]"; content: "Wrong password"; xbits: set, brute_force ,track ip_src, expire 21600; classtype: brute-force; program: asterisk; reference: url,wiki.quadrantsec.com/bin/view/Main/5002942; after: track by_src, count 5, seconds 300; threshold:suppress, track by_src, count 5, seconds 900; sid:5002942; rev:4;)
    

    Sagan runs successfully when loading these rules, but the thresholds are not applied properly, resulting in no threshold being applied. Changing to the documented threshold: type suppress, ... syntax enables the rule to perform as expected.

    Expected behavior

    Sagan fails to load due to the rule being written incorrectly

    Sagan version and OS version

    Sagan v2.1.0 Debian 11

  • after and threshold tracking keys from json_map/normalization

    It could be useful to grab the key used in the json_map field and be able to track by whatever the user defined in json_map. Here's an example:

    alert any $HOME_NET any -> $HOME_NET any (msg:"[WINDOWS-AUTH] Domain Controller Blocked Audit: Audit NTLM authentication to this domain controller [100/1]"; \
    program:Microsoft-Windows-NTLM/Operational; \
    json_map:"event_id",".EventID"; \
    event_id:8004; \
    json_map:"host.workstationName",".WorkstationName"; \
    json_map:"host.username",".Hostname"; \
    after: track by_host.workstationName&by_host.username, count 10, seconds 60; \
    threshold: type suppress, track by_host.workstationName&by_host.username, count 1, seconds 60; \
    classtype:attempted-user; sid:1111111; rev:1;)
    

    In the example we can use the predetermined fields, like event_id, or a user-defined variable. This gives the ability to track by two or more strings with after and/or threshold. If this can be done, maybe the same can be done with liblognorm: any fields defined using liblognorm could be tracked with after and/or threshold.

  • strip character transformation

    Similar to https://suricata.readthedocs.io/en/suricata-6.0.6/rules/transforms.html?highlight=transform

    Since some Windows logs may contain tab characters and some don't, it could be handy to have the ability to remove them and somewhat normalize the logs. This would strip all matches of the given character values, in either hex or ASCII format.

    strip_char: { [hex values|ascii values], [...] } ;

    Examples:

    strip_char: |0d 0a|;          remove carriage return / newline
    strip_char: |09|;             remove tab
    strip_char: "\t";
    strip_char: |20|, |60|, ^;    remove spaces, backticks, and carets

    Backticks and carets can be used to obfuscate commands, and having them removed can help with detections.

    A --debug transform command line option would also help: being able to see how the log looks after the transformation can help with rule writing if something isn't working.
