How to count occurrences of text in a file?



I have a log file sorted by IP addresses, and I want to find the number of occurrences of each unique IP address. How can I do this with bash, possibly listing the number of occurrences next to each IP, such as:



5.135.134.16 count: 5
13.57.220.172: count 30
18.206.226 count:2


and so on.



Here’s a sample of the log:



5.135.134.16 - - [23/Mar/2019:08:42:54 -0400] "GET /wp-login.php HTTP/1.1" 200 2988 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
5.135.134.16 - - [23/Mar/2019:08:42:55 -0400] "GET /wp-login.php HTTP/1.1" 200 2988 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
5.135.134.16 - - [23/Mar/2019:08:42:55 -0400] "POST /wp-login.php HTTP/1.1" 200 3836 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
5.135.134.16 - - [23/Mar/2019:08:42:55 -0400] "POST /wp-login.php HTTP/1.1" 200 3988 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
5.135.134.16 - - [23/Mar/2019:08:42:56 -0400] "POST /xmlrpc.php HTTP/1.1" 200 413 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
13.57.220.172 - - [23/Mar/2019:11:01:05 -0400] "GET /wp-login.php HTTP/1.1" 200 2988 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
13.57.220.172 - - [23/Mar/2019:11:01:06 -0400] "POST /wp-login.php HTTP/1.1" 200 3985 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
13.57.220.172 - - [23/Mar/2019:11:01:07 -0400] "GET /wp-login.php HTTP/1.1" 200 2988 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
13.57.220.172 - - [23/Mar/2019:11:01:08 -0400] "POST /wp-login.php HTTP/1.1" 200 3833 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
13.57.220.172 - - [23/Mar/2019:11:01:09 -0400] "GET /wp-login.php HTTP/1.1" 200 2988 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
13.57.220.172 - - [23/Mar/2019:11:01:11 -0400] "POST /wp-login.php HTTP/1.1" 200 3836 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
13.57.220.172 - - [23/Mar/2019:11:01:12 -0400] "GET /wp-login.php HTTP/1.1" 200 2988 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
13.57.220.172 - - [23/Mar/2019:11:01:15 -0400] "POST /wp-login.php HTTP/1.1" 200 3837 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
13.57.220.172 - - [23/Mar/2019:11:01:17 -0400] "POST /xmlrpc.php HTTP/1.1" 200 413 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
13.57.233.99 - - [23/Mar/2019:04:17:45 -0400] "GET / HTTP/1.1" 200 25160 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.100 Safari/537.36"
18.206.226.75 - - [23/Mar/2019:21:58:07 -0400] "GET /wp-login.php HTTP/1.1" 200 2988 "https://www.google.com/url?3a622303df89920683e4421b2cf28977" "Mozilla/5.0 (Windows NT 6.2; rv:33.0) Gecko/20100101 Firefox/33.0"
18.206.226.75 - - [23/Mar/2019:21:58:07 -0400] "POST /wp-login.php HTTP/1.1" 200 3988 "https://www.google.com/url?3a622303df89920683e4421b2cf28977" "Mozilla/5.0 (Windows NT 6.2; rv:33.0) Gecko/20100101 Firefox/33.0"
18.213.10.181 - - [23/Mar/2019:14:45:42 -0400] "GET /wp-login.php HTTP/1.1" 200 2988 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
18.213.10.181 - - [23/Mar/2019:14:45:42 -0400] "GET /wp-login.php HTTP/1.1" 200 2988 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"
18.213.10.181 - - [23/Mar/2019:14:45:42 -0400] "GET /wp-login.php HTTP/1.1" 200 2988 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:62.0) Gecko/20100101 Firefox/62.0"









command-line bash sort uniq

asked Mar 28 at 21:51 by j0h, edited Mar 28 at 22:25 by dessert


  • With “bash”, do you mean the plain shell or the command line in general?

    – dessert
    Mar 28 at 21:55

  • Do you have any database software available to use?

    – SpacePhoenix
    Mar 29 at 8:58

  • Related

    – Julien Lopez
    Mar 30 at 0:17

  • The log is from an Apache2 server, not really a database. Bash is what I would prefer, as a general use case. I see the Python and Perl solutions; if they are good for someone else, that is great. The initial sorting was done with sort -V, though I think that wasn't required. I sent the top 10 abusers of the login page to the system admin with recommendations for banning the respective subnets. For example, one IP hit the login page over 9000 times; that IP and its class D subnet are now blacklisted. I'm sure we could automate this, though that is a different question.

    – j0h
    Mar 31 at 19:36
8 Answers
You can use grep and uniq for the list of addresses, loop over them and grep again for the count:



for i in $(<log grep -o '^[^ ]*' | uniq); do
    printf '%s count %d\n' "$i" $(<log grep -c "$i")
done


grep -o '^[^ ]*' outputs every character from the beginning of the line (^) up to the first space, and uniq removes repeated lines, leaving you with a list of IP addresses. Thanks to command substitution, the for loop iterates over this list, printing the currently processed IP followed by “ count ” and the count. The latter is computed by grep -c, which counts the number of lines with at least one match.



Example run



$ for i in $(<log grep -o '^[^ ]*'|uniq);do printf '%s count %d\n' "$i" $(<log grep -c "$i");done
5.135.134.16 count 5
13.57.220.172 count 9
13.57.233.99 count 1
18.206.226.75 count 2
18.213.10.181 count 3
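One caveat the original answer does not mention: grep -c "$i" counts every line that contains the address anywhere on the line (and the dots are treated as regex wildcards), so in unusual cases one address could inflate another's count. A minimal variant, not from the original answer, that anchors the match to the start of the line:

for i in $(<log grep -o '^[^ ]*' | uniq); do
    printf '%s count %d\n' "$i" "$(<log grep -c "^$i ")"
done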





  • This solution iterates over the input file repeatedly, once for each IP address, which will be very slow if the file is large. The other solutions using uniq -c or awk only need to read the file once.

    – David
    Mar 29 at 1:56

  • @David this is true, but this would have been my first go at it as well, knowing that grep counts. Unless performance is measurably a problem... don't prematurely optimize?

    – D. Ben Knoble
    Mar 29 at 3:56

  • I would not call it a premature optimization, given that the more efficient solution is also simpler, but to each their own.

    – David
    Mar 29 at 5:26

  • By the way, why is it written as <log grep ... and not grep ... log?

    – Santiago
    Apr 3 at 17:09

  • @Santiago Because that’s better in many ways, as Stéphane Chazelas explains here on U&L.

    – dessert
    Apr 3 at 17:28
You can use cut and uniq tools:



cut -d ' ' -f1 test.txt | uniq -c
5 5.135.134.16
9 13.57.220.172
1 13.57.233.99
2 18.206.226.75
3 18.213.10.181


Explanation:




  • cut -d ' ' -f1 : extract first field (the IP address)


  • uniq -c : report repeated lines and display the number of occurrences (a sketch extending this pipeline follows below)
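Not part of the original answer: since the question's author mentions in the comments that the goal was finding the top abusers, the same pipeline can be extended with sort -rn and head. A minimal sketch, assuming the log is already grouped by IP as in the question and is named test.txt as above:

# top 10 IPs by number of requests (hypothetical extension of this answer's pipeline)
cut -d ' ' -f1 test.txt | uniq -c | sort -rn | head -n 10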





  • One could use sed, e.g. sed -E 's/ *(\S*) *(\S*)/\2 count: \1/' to get the output exactly like OP wanted.

    – dessert
    Mar 28 at 22:22

  • This should be the accepted answer, as the one by dessert needs to read the file repeatedly so is much slower. And you can easily use sort file | cut .... in case you're not sure if the file is already sorted.

    – Guntram Blohm
    Mar 29 at 8:44
If you don't specifically require the given output format, then I would recommend the already posted cut + uniq based answer.



If you really need the given output format, a single-pass way to do it in Awk would be



awk '{c[$1]++} END{for(i in c) print i, "count: " c[i]}' log


This is somewhat non-ideal when the input is already sorted since it unnecessarily stores all the IPs into memory - a better, though more complicated, way to do it in the pre-sorted case (more directly equivalent to uniq -c) would be:



awk '
  NR==1 { last = $1 }
  $1 != last { print last, "count: " c[last]; last = $1 }
  { c[$1]++ }
  END { print last, "count: " c[last] }
'


Ex.



$ awk 'NR==1 {last=$1} $1 != last {print last, "count: " c[last]; last=$1} {c[$1]++} END {print last, "count: " c[last]}' log
5.135.134.16 count: 5
13.57.220.172 count: 9
13.57.233.99 count: 1
18.206.226.75 count: 2
18.213.10.181 count: 3
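A side note that is not in the original answer: the first one-liner prints its results in whatever order awk happens to iterate the array, so if you want the addresses listed in order you can pipe it through sort (GNU sort -V orders dotted quads sensibly):

awk '{c[$1]++} END{for(i in c) print i, "count: " c[i]}' log | sort -V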





  • it would be easy to change the cut + uniq based answer with sed to appear in the demanded format.

    – Peter A. Schneider
    Mar 29 at 11:12

  • @PeterA.Schneider yes it would - I believe that was already pointed out in comments to that answer

    – steeldriver
    Mar 29 at 12:07

  • Ah, yes, I see.

    – Peter A. Schneider
    Mar 29 at 12:36
Here is one possible solution:





IN_FILE="file.log"
for IP in $(awk 'print $1' "$IN_FILE" | sort -u)
do
echo -en "$IPtcount: "
grep -c "$IP" "$IN_FILE"
done


  • replace file.log with the actual file name.

  • the command substitution expression $(awk '{print $1}' "$IN_FILE" | sort -u) will provide a list of the unique values of the first column.

  • then grep -c will count each of these values within the file.


$ IN_FILE="file.log"; for IP in $(awk 'print $1' "$IN_FILE" | sort -u); do echo -en "$IPtcount: "; grep -c "$IP" "$IN_FILE"; done
13.57.220.172 count: 9
13.57.233.99 count: 1
18.206.226.75 count: 2
18.213.10.181 count: 3
5.135.134.16 count: 5
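As the comment below suggests, printf is usually preferred over echo -e; a small rewrite of the loop body (not part of the original answer, reusing the same variable names):

    printf '%s\tcount: %d\n' "$IP" "$(grep -c "$IP" "$IN_FILE")"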





  • Prefer printf...

    – D. Ben Knoble
    Mar 29 at 3:58

  • This means you need to process the entire file multiple times. Once to get the list of IPs and then once more for each of the IPs you find.

    – terdon
    Mar 29 at 16:07
Some Perl:

$ perl -lae '$k{$F[0]}++; }{ print "$_ count: $k{$_}" for keys(%k)' log
13.57.233.99 count: 1
18.206.226.75 count: 2
13.57.220.172 count: 9
5.135.134.16 count: 5
18.213.10.181 count: 3


This is the same idea as Steeldriver's awk approach, but in Perl. The -a causes perl to automatically split each input line into the array @F, whose first element (the IP) is $F[0]. So, $k{$F[0]}++ will create the hash %k, whose keys are the IPs and whose values are the number of times each IP was seen. The }{ is funky perlspeak for "do the rest at the very end, after processing all input". So, at the end, the script will iterate over the keys of the hash and print the current key ($_) along with its value ($k{$_}).


And, just so people don't think that perl forces you to write scripts that look like cryptic scribblings, this is the same thing in a less condensed form:

perl -e '
  while (my $line=<STDIN>) {
    @fields = split(/ /, $line);
    $ip = $fields[0];
    $counts{$ip}++;
  }
  foreach $ip (keys(%counts)) {
    print "$ip count: $counts{$ip}\n"
  }
' < log
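A small variant that is not part of the original answer: keys(%k) returns the addresses in arbitrary hash order, so if you want the output sorted (lexically) by address you can add Perl's sort:

$ perl -lae '$k{$F[0]}++; }{ print "$_ count: $k{$_}" for sort keys(%k)' log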














Maybe this is not what the OP wants; however, if we know that the IP address length is limited to 15 characters, a quicker way to display the counts of unique IPs from a huge log file can be achieved using the uniq command alone:



      $ uniq -w 15 -c log

      5 5.135.134.16 - - [23/Mar/2019:08:42:54 -0400] ...
      9 13.57.220.172 - - [23/Mar/2019:11:01:05 -0400] ...
      1 13.57.233.99 - - [23/Mar/2019:04:17:45 -0400] ...
      2 18.206.226.75 - - [23/Mar/2019:21:58:07 -0400] ...
      3 18.213.10.181 - - [23/Mar/2019:14:45:42 -0400] ...


      Options:



      -w N compares no more than N characters in lines



      -c will prefix lines by the number of occurrences



Alternatively, for exactly formatted output I prefer awk (which should also work for IPv6 addresses), YMMV:



$ awk 'NF {print $1}' log | sort -h | uniq -c | awk '{printf "%s count: %d\n", $2, $1}'

      5.135.134.16 count: 5
      13.57.220.172 count: 9
      13.57.233.99 count: 1
      18.206.226.75 count: 2
      18.213.10.181 count: 3


      Note that uniq won't detect repeated lines in the input file if they are not adjacent, so it may be necessary to sort the file.
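Building on that note (this command is not part of the original answer): if the log were not already grouped by IP, you could simply sort it first, since identical addresses then end up on adjacent lines:

$ sort log | uniq -w 15 -c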






  • Likely good enough in practice, but worth noting the corner cases. Only 6 probably constant characters after the IP ` - - [`. But in theory the address could be up to 8 characters shorter than the maximum, so a change of date could split the count for such an IP. And as you hint, this won't work for IPv6.

    – Martin Thornton
    Mar 29 at 23:17

  • I like it, I didn't know uniq could count!

    – j0h
    Mar 31 at 12:57
      FWIW, Python 3:



from collections import Counter

with open('sample.log') as file:
    counts = Counter(line.split()[0] for line in file)

for ip_address, count in counts.items():
    print('%-15s count: %d' % (ip_address, count))


      Output:



      13.57.233.99 count: 1
      18.213.10.181 count: 3
      5.135.134.16 count: 5
      18.206.226.75 count: 2
      13.57.220.172 count: 9
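Not part of the original answer: since the question's author mentions wanting the top offenders, collections.Counter can also report the most frequent addresses directly; a minimal sketch reusing the counts object from above:

# hypothetical follow-up: ten most frequent addresses, highest count first
for ip_address, count in counts.most_common(10):
    print('%-15s count: %d' % (ip_address, count))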





cut -f1 -d- my.log | sort | uniq -c


Explanation: Take the first field of my.log splitting on dashes - and sort it. uniq needs sorted input. -c tells it to count occurrences.





