
Problem with comparison of dates. S3 plugin #172

@ricardostocco

Description

When configuring the s3 input plugin I have a problem with date comparison: the last file is re-processed even though its last-modified timestamp has not changed in the bucket.

I am NOT deleting the files after they are processed.

input {
  s3 {
    access_key_id => "xxxxx"
    secret_access_key => "xxxx"
    bucket => "xxxxx"
    endpoint => "http://xxxxx.xxxxx.com"
    region => "xxxxx"
    proxy_uri => "http://xxxx:xxxxx@xxxxxx:3128"
  }
}

Logstash version: 7.0.1
OS: MS Windows 7
The bucket is not on AWS S3; I'm using Open Cloud OBS (S3-compatible).

For example, in the sincedb_xxx file I have this date:
2019-05-20 23:45:10 +0000

and every 60 seconds Logstash reprocesses a file whose last-modified date is "2019-05-20 23:45:10 UTC".

It seems that the comparison "sincedb.newer?(log.last_modified)" returns true when comparing those values.

If I modify the comparison to compare integers instead, it seems to work correctly (file s3.rb):

    def newer?(date)
      date.to_i > read.to_i
    end
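
Just a guess on my part, but this makes me suspect sub-second precision rather than the printed dates: Ruby's Time#> also compares fractional seconds, while to_i truncates to whole seconds. A quick irb sketch (the 0.5 fractional second is invented for illustration):

    require 'time'

    # What the sincedb file stores: whole-second precision
    stored = Time.parse("2019-05-20 23:45:10 +0000")

    # What the bucket might report as last_modified: the same second,
    # but with a fractional part (made-up value)
    last_modified = stored + 0.5

    puts stored                            # 2019-05-20 23:45:10 +0000
    puts last_modified                     # 2019-05-20 23:45:10 +0000 (prints the same)
    puts last_modified > stored            # true  -- Time#> sees the fractional seconds
    puts last_modified.to_i > stored.to_i  # false -- to_i drops them

If last_modified really carries a fractional second while the value written to sincedb_xxx does not, the file would always look "newer", which would explain why the integer comparison stops the re-processing.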

Why is the date comparison wrong? Could the timezone be the problem ("+0000" vs. "UTC")?
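
To check the timezone idea myself, I parsed both forms in irb; as far as I can tell they represent the same instant, so the "+0000" vs. "UTC" notation alone does not seem to be the cause:

    require 'time'

    a = Time.parse("2019-05-20 23:45:10 +0000")
    b = Time.parse("2019-05-20 23:45:10 UTC")

    puts a == b           # true -- Time equality compares the instant, not the zone label
    puts a.to_i - b.to_i  # 0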
