
User:Jbond/debuging

Revision as of 11:24, 22 June 2021 by imported>Jbond (→‎mw server)

Logs

https://wikitech.wikimedia.org/wiki/Logs

Sampled-1000.json on centrallog1001

Grep-able output

$ jq -r '[.uri_path,.hostname,.user_agent,.ip] | @csv' /srv/log/webrequest/sampled-1000.json
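The same filter can be tried against a hand-written sample line (the JSON below is made up, but uses the field names from the command above):

```shell
# Made-up sample record in the sampled-1000.json shape
line='{"uri_path":"/wiki/Main_Page","hostname":"en.wikipedia.org","user_agent":"curl/7.74.0","ip":"192.0.2.10"}'
# @csv emits each selected field double-quoted, comma-separated
echo "$line" | jq -r '[.uri_path,.hostname,.user_agent,.ip] | @csv'
# → "/wiki/Main_Page","en.wikipedia.org","curl/7.74.0","192.0.2.10"
```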

Select all requests with a specific user_agent and .referer

$ jq -r 'if .user_agent == "-" and .referer == "-" then [.uri_path,.hostname,.user_agent,.ip] else empty end | @csv' /srv/log/webrequest/sampled-1000.json
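A quick sanity check of the conditional filter, using two hypothetical records (only the one whose user_agent and referer are both "-" should survive):

```shell
printf '%s\n' \
  '{"uri_path":"/w/a","hostname":"h1","user_agent":"-","referer":"-","ip":"198.51.100.1"}' \
  '{"uri_path":"/w/b","hostname":"h2","user_agent":"curl","referer":"x","ip":"198.51.100.2"}' |
jq -r 'if .user_agent == "-" and .referer == "-" then [.uri_path,.hostname,.user_agent,.ip] else empty end | @csv'
# → "/w/a","h1","-","198.51.100.1"
```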

List of the top 10 IPs by response size

$ head -n 2560000 /srv/log/webrequest/sampled-1000.json | jq -r '.ip + " " + (.response_size | tostring)' | awk '{ sum[$1] += $2 } END { for (ip in sum) print sum[ip],ip }' | sort -nr | head -10
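The awk stage above sums the per-request sizes per IP; a minimal sketch of that aggregation with made-up "ip size" pairs:

```shell
# Hypothetical "ip size" pairs; awk sums sizes per IP, sort ranks by total
printf '%s\n' '10.0.0.1 100' '10.0.0.2 50' '10.0.0.1 200' |
awk '{ sum[$1] += $2 } END { for (ip in sum) print sum[ip],ip }' | sort -nr
# → 300 10.0.0.1
# → 50 10.0.0.2
```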

mw server

List all IPs which have made more than 100 large requests

$ awk '$2>60000 {print $11}' /var/log/apache2/other_vhosts_access.log | sort | uniq -c | awk '$1>100 {print}'
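The field positions ($2 for response size, $11 for client IP) depend on the configured LogFormat. The filter-count-filter pattern itself can be sketched with simplified two-field lines (size, then IP) and lowered thresholds:

```shell
# Hypothetical lines: field 1 = response size, field 2 = client IP
# Keep requests over 60000 bytes, count per IP, keep IPs seen more than once
printf '%s\n' '70000 10.0.0.9' '70000 10.0.0.9' '100 10.0.0.8' '70000 10.0.0.7' |
awk '$1>60000 {print $2}' | sort | uniq -c | awk '$1>1 {print $1, $2}'
# → 2 10.0.0.9
```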

MediaWiki Shell

$ ssh mwmaint1002
$ mwscript maintenance/shell.php --wiki=enwiki

Then

>>> var_dump($wgUpdateRowsPerQuery);
int(100)
=> null
>>>

LVS Server

Sample 100k pkts and list top talkers

$ sudo tcpdump -i enp4s0f0 -pn -c 100000 | sed -r 's/.* IP6? //;s/\.[^\.]+ .*//' | sort | uniq -c | sort -nr | head -20
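The sed expression strips the timestamp prefix and the trailing port/flags, leaving only the source address. On a hypothetical tcpdump summary line:

```shell
# 's/.* IP6? //' drops everything up to "IP "/"IP6 "; 's/\.[^\.]+ .*//' drops the port and the rest
echo '12:00:00.000000 IP 10.0.0.1.443 > 10.0.0.2.55001: Flags [S], length 0' |
sed -r 's/.* IP6? //;s/\.[^\.]+ .*//'
# → 10.0.0.1
```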

Testing a site against a specific LVS

$ curl --connect-to "::text-lb.${site}.wikimedia.org" https://en.wikipedia.org/wiki/Main_Page?x=$RANDOM

CP Server

Check the connection tuples for the varnish

$ sudo ss -tan 'sport = :3120' | awk '{print $(NF)" "$(NF-1)}' | sed 's/:[^ ]*//g' | sort | uniq -c
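The awk/sed stages pick the last two columns (peer and local address), strip the ports, and count each peer/local pair. A sketch with two hypothetical lines in the `ss -tan` column layout:

```shell
# Columns: State Recv-Q Send-Q Local:Port Peer:Port
printf '%s\n' \
  'ESTAB 0 0 10.0.0.1:3120 10.0.0.2:41000' \
  'ESTAB 0 0 10.0.0.1:3120 10.0.0.2:41001' |
awk '{print $(NF)" "$(NF-1)}' | sed 's/:[^ ]*//g' | sort | uniq -c
# → 2 10.0.0.2 10.0.0.1   (two connections between the same peer/local pair)
```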

The number of available ports (which also caps the number of connection tuples) can be read from the file below; if the count above is equal to or approaching the number of available ports, there could be an issue

$ cat /proc/sys/net/ipv4/ip_local_port_range
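The file holds the low and high ends of the ephemeral port range, so the number of usable ports is high - low + 1. A self-contained sketch using a hypothetical range rather than reading /proc:

```shell
# Hypothetical contents of ip_local_port_range: "low<TAB>high"
printf '32768\t60999\n' | awk '{print $2 - $1 + 1}'
# → 28232 usable ephemeral ports
```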

Checking sites from CP server

You can use curl from the cp servers to ensure you hit the frontend or backend cache, and to have it fetch a specific site, with the following commands

Using $RANDOM below prevents us from hitting the cache

frontend

$ curl --connect-to "::$HOSTNAME" https://en.wikipedia.org/wiki/Main_Page?x=$RANDOM

backend

$ curl --connect-to "::$HOSTNAME:3128" -H "X-Forwarded-Proto: https" https://en.wikipedia.org/wiki/Main_Page?x=$RANDOM

Proxied web service

Show all request and response headers on loopback

$ sudo stdbuf -oL -eL /usr/sbin/tcpdump -Ai lo -s 10240 "tcp port 80 and (((ip[2:2] - ((ip[0]&0xf)<<2)) - ((tcp[12]&0xf0)>>2)) != 0)" | egrep -a --line-buffered ".+(GET |HTTP\/|POST )|^[A-Za-z0-9-]+: " | perl -nle 'BEGIN{$|=1} { s/.*?(GET |HTTP\/[0-9.]* |POST )/\n$1/g; print }'

re: https://serverfault.com/a/633452/464916

Pooling

Check the pooled state

Service

$ confctl select service=thumbor get

Host

$ confctl select dc=eqiad,cluster=cache_text,service=varnish-be,name=cp1052.eqiad.wmnet get

Depooling

https://wikitech.wikimedia.org/wiki/Depooling_servers

pybal

Check log files /var/log/pybal.log on lvs servers