log mining

Dig insight LOG MINING [email protected] https://github.com/tcz001

Upload: fan-jiang

Posted on 15-Jul-2015


TRANSCRIPT

Page 1: Log mining

Dig insight

LOG MINING

[email protected]

https://github.com/tcz001

Page 2: Log mining

TECH RADAR TREND


structured-logging

Page 3: Log mining

WHAT IS A LOG?

> tail -f /usr/local/log

INFO  [2014-11-13 12:23:36,173] com.thoughtworks.forcetalk.resources.ContactResource: Updated Contact {"FirstName":"Alper","LastName":"Mermer","Employee_ID__c":"16906","Email":"[email protected]","Grade__c":"Senior Consultant"}
ERROR [2014-11-13 11:45:33,892] com.thoughtworks.forcetalk.validators.ForceQueryResultsValidator: Unable to retrieve Project for Opportunity with id: 0065000000TE2evAAD
INFO  [2014-11-13 12:23:36,505] com.thoughtworks.tetalk.resources.UserResource: Contact Update Response SObjectResponse{successful=true, id='null', errorMessage='null', errorField='null', errorCode='null'}

INFO   2014-11-13 12:23:36,173   com.thoughtworks.forcetalk.resources.ContactResource

ERROR
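The anatomy above (level, timestamp, source, message) can be pulled out of a raw line with a small parser. A sketch in Python, using a regex in the same shape as the grok pattern shown later in the deck:

```python
import re

# Mirrors the deck's grok pattern:
# %{LOGLEVEL:loglevel}\s+\[%{TIMESTAMP_ISO8601:timestamp}\] (?<source>...): (?<msg>...)
LOG_RE = re.compile(
    r'(?P<loglevel>[A-Z]+)\s+'
    r'\[(?P<timestamp>[^\]]+)\]\s+'
    r'(?P<source>[\w.]+):\s+'
    r'(?P<msg>.*)'
)

line = ('INFO [2014-11-13 12:23:36,173] '
        'com.thoughtworks.forcetalk.resources.ContactResource: Updated Contact')
fields = LOG_RE.match(line).groupdict()
print(fields['loglevel'], fields['source'])
```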

Page 4: Log mining

WHAT MAKES A GOOD LOG?

▫ http://juliusdavies.ca/logging/llclc.html

Best Logs:
▫ Tell you exactly what happened: when, where, and how.
▫ Suitable for manual, semi-automated, or automated analysis.
▫ Can be analysed without having the application that produced them at hand.
▫ Don't slow the system down.
▫ Can be proven reliable (if used as evidence).

Logs to Avoid:
▫ Missing necessary information.
▫ Unsuitable for grep because of redundant information.
▫ Information split across more than one line (bad for grep).
▫ Error reported to the user, but not logged.
▫ Containing sensitive data (never log it — for security!).
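Python's standard-library logging can be configured to follow these rules — one grep-able line per event, in the same level/timestamp/source layout as the earlier slide. A minimal sketch (the logger name is a hypothetical example):

```python
import io
import logging

# Capture output in a string so the resulting line is easy to inspect.
buf = io.StringIO()
handler = logging.StreamHandler(buf)
# One line per event: LEVEL [timestamp] source: message
handler.setFormatter(logging.Formatter(
    '%(levelname)s [%(asctime)s] %(name)s: %(message)s'))

log = logging.getLogger('com.example.ContactResource')  # hypothetical source name
log.addHandler(handler)
log.setLevel(logging.INFO)

log.info('Updated Contact id=%s', '16906')
print(buf.getvalue().strip())
```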

Page 5: Log mining

A DEVOPS STORY

> rm -rf ALL_THE_LOGS

Page 6: Log mining

A DEVOPS STORY

We got an angry User! HELP!

Page 7: Log mining

BE REACTIVE


Page 8: Log mining

MONITOR IS FAR FROM “TOP”


Page 9: Log mining

SAVE OUR LIFE


?

Page 10: Log mining

SAVE OUR LIFE


Splunk (SaaS)

LogStash (open source)

OR

Page 11: Log mining

SAVE OUR LIFE


Page 12: Log mining

SAVE OUR LIFE


WHAT TIME IS IT?
1304060505
29/Apr/2011:07:05:26 +0000
Fri, 21 Nov 1997 09:55:06 -0600
Oct 11 20:21:47
020805 13:51:24
110429.071055,118
@4000000037c219bf2ef02e94

DATE FILTER FIXES THIS BULLSHIT

filter {
  date {
    # Turn 020805 13:51:24
    # into 2002-08-05T13:51:24.000Z
    match => [ "mysqltimestamp", "YYMMdd HH:mm:ss" ]
  }
}
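The same normalization the date filter performs can be sketched in a few lines of Python — parse the compact form, then emit ISO-8601 (assuming the source timestamp is UTC):

```python
from datetime import datetime, timezone

# '020805 13:51:24' (YYMMdd HH:mm:ss) -> ISO-8601, assuming UTC
ts = datetime.strptime('020805 13:51:24', '%y%m%d %H:%M:%S')
ts = ts.replace(tzinfo=timezone.utc)
print(ts.isoformat())  # 2002-08-05T13:51:24+00:00
```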

Page 13: Log mining

SAVE OUR LIFE


> 23 INPUTS | 18 FILTERS | 40 OUTPUTS

More than just the timestamp

▫ LogLevel
▫ Source
▫ IP => GeoHash
▫ Browser/Platform
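The IP => GeoHash enrichment needs two pieces: an IP-to-coordinates lookup (Logstash ships a geoip filter for that part) and a geohash encoder. A minimal sketch of the encoder half in Python, with the lookup omitted — the coordinates in the example are just a standard test point, not data from the deck:

```python
BASE32 = '0123456789bcdefghjkmnpqrstuvwxyz'

def geohash_encode(lat, lon, precision=11):
    """Interleave longitude/latitude bisection bits, then base32-encode them."""
    lat_lo, lat_hi = -90.0, 90.0
    lon_lo, lon_hi = -180.0, 180.0
    bits = []
    even = True  # a geohash starts with a longitude bit
    while len(bits) < precision * 5:
        if even:
            mid = (lon_lo + lon_hi) / 2
            if lon >= mid:
                bits.append(1); lon_lo = mid
            else:
                bits.append(0); lon_hi = mid
        else:
            mid = (lat_lo + lat_hi) / 2
            if lat >= mid:
                bits.append(1); lat_lo = mid
            else:
                bits.append(0); lat_hi = mid
        even = not even
    # Pack every 5 bits into one base32 character.
    chars = []
    for i in range(0, len(bits), 5):
        value = 0
        for b in bits[i:i + 5]:
            value = value * 2 + b
        chars.append(BASE32[value])
    return ''.join(chars)

print(geohash_encode(57.64911, 10.40744))
```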

Page 14: Log mining

SAVE OUR LIFE


Logstash-server

input {
  lumberjack {
    # The port to listen on
    port => 5043

    # The paths to your ssl cert and key
    ssl_certificate => "./logstash.crt"
    ssl_key => "./logstash.key"

    # Set this to whatever you want.
    type => "finance"
  }
}

filter {
  if [type] == "finance" {
    grok {
      match => [ "message",
        "%{LOGLEVEL:loglevel}\s+\[%{TIMESTAMP_ISO8601:timestamp}\] (?<source>(\w|\.)+): (?<msg>(.*))" ]
      add_tag => [ "grokked" ]
    }
    date {
      match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
    }
  }
}

output {
  if "_grokparsefailure" not in [tags] {
    stdout { codec => rubydebug }
    elasticsearch { host => localhost }
  }
}

Logstash-forwarder

{
  "network": {
    "servers": [ "localhost:5043" ],
    "ssl ca": "./logstash-forwarder.crt"
  },
  "files": [
    {
      "paths": [
        "/usr/local/finance/**/logs/*.log"
      ],
      "dead time": "8760h",
      "fields": { "type": "finance" }
    }
  ]
}

All Our Services

ElasticSearch Clusters

Page 15: Log mining

ELASTICSEARCH


▫ RESTful API search engine
▫ Multi-cluster support
▫ Great community
▫ Use it! Throw things into it!

ElasticSearch + Kibana

Page 16: Log mining

DIGGING DEEPER


curl -XGET 'http://localhost:9200/logstash-*/_search?pretty&search_type=count' -d '{
  "aggregations": {
    "source-aggregation": {
      "terms": {
        "field": "source",
        "size": 1000
      }
    }
  }
}'

Try it!
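The interesting part of the reply lives under aggregations → source-aggregation → buckets. A sketch of turning that into per-source counts — the response below is a hand-made sample in the shape the terms aggregation returns, with invented doc_counts:

```python
# Hand-made sample in the shape Elasticsearch returns for a terms aggregation
# (the doc_count values here are invented for illustration).
response = {
    "aggregations": {
        "source-aggregation": {
            "buckets": [
                {"key": "com.thoughtworks.forcetalk.resources.ContactResource",
                 "doc_count": 124},
                {"key": "com.thoughtworks.forcetalk.validators.ForceQueryResultsValidator",
                 "doc_count": 17},
            ]
        }
    }
}

buckets = response["aggregations"]["source-aggregation"]["buckets"]
counts = {b["key"]: b["doc_count"] for b in buckets}
noisiest = max(counts, key=counts.get)
print(noisiest, counts[noisiest])
```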

Page 17: Log mining

DIGGING DEEPER


http://localhost:8000/

Zoomable treemap for digging into logs by source

via the Elasticsearch aggregation API

Page 18: Log mining

LEARN FROM LOG


Treat logs as statistical data
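Once lines are parsed into fields, basic statistics fall out of a one-liner; e.g. counting errors per source with a Counter (the records here are a small invented sample standing in for the output of the grok stage):

```python
from collections import Counter

# Parsed (loglevel, source) pairs, e.g. from the grok/regex stage;
# this data is a small invented sample.
records = [
    ("INFO",  "ContactResource"),
    ("ERROR", "ForceQueryResultsValidator"),
    ("INFO",  "UserResource"),
    ("ERROR", "ForceQueryResultsValidator"),
    ("INFO",  "ContactResource"),
]

errors_by_source = Counter(src for level, src in records if level == "ERROR")
print(errors_by_source.most_common(1))
```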

Page 19: Log mining

AUTO REACTIVE


Be Responsive to every Exception
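Reacting automatically presupposes a policy for when the exception stream is abnormal. A minimal sketch of one such policy — the threshold and minimum window size are arbitrary illustrative values, not from the deck:

```python
def should_alert(error_count, total, rate_threshold=0.05, min_events=20):
    """Alert when the error rate in a window exceeds the threshold.

    The threshold and minimum window size are arbitrary illustrative values.
    """
    if total < min_events:
        return False  # too little data in the window to judge
    return error_count / total > rate_threshold

print(should_alert(5, 50))  # 10% error rate in a big enough window
```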

Page 20: Log mining

OTHER POSSIBILITY


Page 21: Log mining

Q&A

Thanks~
