ProxHTTPSProxyMII: Reloaded
May. 09, 2018, 02:48 PM
Post: #211
RE: ProxHTTPSProxyMII: Reloaded
(May. 09, 2018 07:03 AM)ryszardzonk Wrote:  To make sure the message is not coming from the first proxy in the chain, I skipped Squid and pointed the browser to use ProxHTTPSProxyMII for HTTPS and Privoxy for HTTP.

Did you point the browser to the ProxHTTPSProxyMII front server at port 3129?

I gotta go.
May. 09, 2018, 04:09 PM (This post was last modified: May. 10, 2018 06:05 PM by ryszardzonk.)
Post: #212
RE: ProxHTTPSProxyMII: Reloaded
(May. 09, 2018 02:48 PM)JJoe Wrote:  Did you point the browser to the ProxHTTPSProxyMII front server at port 3129?

Yes, I did make sure of that. I also checked the behaviour on Windows 10 with Edge (setting up the proxy there was a bit tricky, as it is done in the system rather than in the browser) with the same result, just a somewhat different certificate installation process.

Then I thought it might have something to do with the software versions I use on my Gentoo Linux server:
- dev-lang/python-3.5.5-r1
- dev-python/colorama-0.3.9
- dev-python/pyopenssl-17.5.0
- dev-python/PySocks-1.6.7 (there is pysocks 1.6.8 available upstream)
- dev-python/urllib3-1.22

Then it hit me. I started ProxHTTPSProxyMII in debug mode and saw errors in the log and, to my surprise, a few properly working GETs. It turned out that when I visited a certain website in the browser for the first time I got an error; the second time it worked.
Code:
[17:55] 000 "[SSL: SSLV3_ALERT_BAD_CERTIFICATE] sslv3 alert bad certificate (_ssl.c:2091)" while trying to establish local SSL tunnel for [forum.openstreetmap.org:443]
[17:55] 193 [D] "GET https://forum.openstreetmap.org/index.php" 200 5198
[17:55] 194 [D] "GET https://forum.openstreetmap.org/style/Air.css" 200 5684
[17:55] 196 [D] "GET https://forum.openstreetmap.org/style/Air/img/feed.png" 200 439
[17:55] 195 [D] "GET https://forum.openstreetmap.org/style/Air/img/bull.png" 200 107
[17:55] 197 [D] "GET https://forum.openstreetmap.org/favicon.ico" 404 299
[17:55] 198 [D] "GET https://forum.openstreetmap.org/favicon.ico" 404 299
[17:56] 199 [D] "GET https://forum.openstreetmap.org/viewtopic.php?id=11155" 200 -
[17:56] 201 [D] "GET https://forum.openstreetmap.org/img/smilies/big_smile.png" 200 373
[17:56] 200 [D] "GET https://forum.openstreetmap.org/img/smilies/smile.png" 200 426
[17:56] 202 [D] "GET https://forum.openstreetmap.org/img/smilies/wink.png" 200 428
[17:56] 203 [D] "GET https://forum.openstreetmap.org/style/Air/img/ext.png" 200 130

So it seems that on the first try ProxHTTPSProxyMII saves the required certificate but does not use it, and only sets up a proper connection on the next try, when it already has the certificate on disk... uffff...
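For what it's worth, the behaviour matches a classic cache-miss race: the certificate is issued and written to disk, but the code path serving the first request never picks it up. A hypothetical sketch (not ProxHTTPSProxyMII's actual code) of the issue-and-use-immediately fix:

```python
# Hypothetical sketch: return the freshly issued certificate right away
# instead of only reading it back from disk on the *next* request.
import os, tempfile, threading

CERT_DIR = tempfile.mkdtemp()    # stand-in for the proxy's certificate folder
_lock = threading.Lock()

def issue_cert(host):
    """Placeholder for the real CA-signing step; just writes a file here."""
    path = os.path.join(CERT_DIR, host + ".crt")
    with open(path, "w") as f:
        f.write("dummy certificate for " + host)
    return path

def get_cert(host):
    path = os.path.join(CERT_DIR, host + ".crt")
    with _lock:                      # one issuer at a time per process
        if not os.path.exists(path):
            return issue_cert(host)  # first visit: create AND use it
    return path
```

All names here (CERT_DIR, issue_cert, get_cert) are illustrative, not taken from the project's source.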

Anyway, Thumbs Up, and awaiting a fix for using the certificate on the first try too Wink

EDIT: I failed to get the whole Squid / ProxHTTPSProxyMII / privoxy / ProxHTTPSProxyMII chain working and got a number of SSL errors.
So far I have been able to get working either:
- Squid (http&https) / privoxy (http only)
- ProxHTTPSProxyMII / privoxy / ProxHTTPSProxyMII

It might be that Squid, doing its own SSL mangling, somehow will not work with ProxHTTPSProxyMII, which recreates SSL certificates too. Maybe the solution would be installing the CA.crt somewhere in the system for Squid?

EDIT2:
I tried something like 100 different combinations of squid-3.5.27 parent proxy cache_peer and sslproxy_flags options that might help Squid accept the SSL connection from ProxHTTPSProxyMII, such as
Code:
sslflags=DONT_VERIFY_PEER,DONT_VERIFY_DOMAIN
sslcapath=/opt/proxhttpsproxy/
sslcafile=/opt/proxhttpsproxy/proxhttpsproxy.pem (CA.crt file without private key)
but without success; Squid logged errors such as these on every try:

Code:
2018/05/10 19:25:18 kid1| Error negotiating SSL on FD 12: error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol (1/-1/0)
2018/05/10 19:25:18 kid1| TCP connection to 127.0.0.1/3129 failed

2018/05/10 19:42:31 kid1| Error negotiating SSL on FD 13: error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol (1/-1/0)
2018/05/10 19:42:31 kid1| TCP connection to 127.0.0.1/3129 failed

while ProxHTTPSProxyMII's response to Squid's request looked as follows:

Code:
127.0.0.1 - - [10/May/2018 20:02:06] code 400, message Bad request syntax ('\x16\x03\x01\x013\x01\x00\x01/\x03\x03¡1d\xad¿²\x81C¯\x0f{\x8câøÛÑxñ\x94úc3y\x97î²xå=Ü-\x87\x00\x00¬À0À,À(À$À\x14À')
127.0.0.1 - - [10/May/2018 20:02:06] "3/¡1d­¿²C¯{ŒâøÛÑxñ”úc3yî²xå=Ü-‡¬À0À,À(À$ÀÀ" 400 -
127.0.0.1 - - [10/May/2018 20:02:06] code 400, message Bad request syntax ('\x16\x03\x01\x013\x01\x00\x01/\x03\x03©ÎðÁ\x16\x96\x94#\x1aC\x82hÿeK=wÎD\x9f£%½7xVc\x98¾Ü>\x00\x00¬À0À,À(À$À\x14À')
127.0.0.1 - - [10/May/2018 20:02:06] "3/©ÎðÁ”#C‚hÿeK=wÎDŸ£%½7xVc˜¾Ü>¬À0À,À(À$ÀÀ" 400 -
127.0.0.1 - - [10/May/2018 20:02:06] code 400, message Bad request syntax ("\x16\x03\x01\x013\x01\x00\x01/\x03\x03¹\x86U\x08]x\x1b¶'\x9e\x18\x08\x02KX\xad8©Ù\x9c»I*\x9bå\x95\x87.Õ\x9fA_\x00\x00¬À0À,À(À$À\x14À")
127.0.0.1 - - [10/May/2018 20:02:06] "3/¹†]x¶'žKX­8©Ùœ»I*›å•‡.՟A_¬À0À,À(À$ÀÀ" 400 -
127.0.0.1 - - [10/May/2018 20:02:06] code 400, message Bad request version ('ïûûÇ\x89Æ\x921Î:`\x00\x00¬À0À,À(À$À\x14À')
127.0.0.1 - - [10/May/2018 20:02:06] "3/Ûry%É4$Ç6¿øÅ©ÄýÞñì ïûûljƒ1Î:`¬À0À,À(À$ÀÀ" 400 -

Is it even possible to connect the two together?
May. 11, 2018, 02:42 AM
Post: #213
RE: ProxHTTPSProxyMII: Reloaded
(May. 09, 2018 04:09 PM)ryszardzonk Wrote:  So it seems that on the first try ProxHTTPSProxyMII saves the required certificate but does not use it, and only sets up a proper connection on the next try, when it already has the certificate on disk... uffff...

Anyway, Thumbs Up, and awaiting a fix for using the certificate on the first try too Wink

I haven't noticed this. I have rarely seen the connection timing out before the address is resolved on the first visit. The cached certificate might help but it has been too rare to study.
When this happens on Windows the ProxHTTPSProxyMII log shows something like

Quote:(Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x05C67630>: Failed to establish a new connection: [Errno 11001] getaddrinfo failed',))

This is more of a missing feature of urllib3 than a bug. The time limit can be increased.
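If the limit does need raising, urllib3 accepts an explicit timeout, e.g. `urllib3.PoolManager(timeout=urllib3.util.Timeout(connect=15.0))`. As a stdlib-only illustration of papering over a transient resolver miss like the one in that log (the function name, attempt count, and delay are made up for the example):

```python
import socket, time

def resolve_with_retry(host, port, attempts=3, delay=0.5):
    """Retry DNS resolution a few times, since a one-off
    '[Errno 11001] getaddrinfo failed' is often a transient miss
    on the first visit to a site."""
    for i in range(attempts):
        try:
            return socket.getaddrinfo(host, port)
        except socket.gaierror:
            if i == attempts - 1:
                raise            # still failing after the last attempt
            time.sleep(delay)
```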

(May. 09, 2018 04:09 PM)ryszardzonk Wrote:  It might be that Squid, doing its own SSL mangling, somehow will not work with ProxHTTPSProxyMII, which recreates SSL certificates too. Maybe the solution would be installing the CA.crt somewhere in the system for Squid?

Browser>>ProxHTTPSProxyMII front>>Privoxy>>ProxHTTPSProxyMII Rear>>Squid

Assuming that you haven't disabled verification in ProxHTTPSProxyMII, did you add Squid's certificate to ProxHTTPSProxyMII certificate store (cacert.pem), like you added ProxHTTPSProxyMII's to the browser's?

If the chain is as above, you could disable verification in ProxHTTPSProxyMII and certificate creation in Squid. Then ProxHTTPSProxyMII would hide the mitm from the browser and Squid would verify the site's certificates.

(May. 09, 2018 04:09 PM)ryszardzonk Wrote:  EDIT2:
I tried something like 100 different combinations of squid-3.5.27 parent proxy cache_peer and sslproxy_flags options that might help Squid accept the SSL connection from ProxHTTPSProxyMII, such as
Code:
sslflags=DONT_VERIFY_PEER,DONT_VERIFY_DOMAIN
sslcapath=/opt/proxhttpsproxy/
sslcafile=/opt/proxhttpsproxy/proxhttpsproxy.pem (CA.crt file without private key)

What about:

Code:
sslcipher=...    The list of valid SSL ciphers to use when connecting
            to this peer.
    
ssloptions=...     Specify various SSL implementation options:

                NO_SSLv2    Disallow the use of SSLv2
                NO_SSLv3    Disallow the use of SSLv3
                NO_TLSv1    Disallow the use of TLSv1.0
                NO_TLSv1_1  Disallow the use of TLSv1.1
                NO_TLSv1_2  Disallow the use of TLSv1.2

                SINGLE_DH_USE
                      Always create a new key when using
                      temporary/ephemeral DH key exchanges

                NO_TICKET
                      Disable use of RFC5077 session tickets. Some servers
                      may have problems understanding the TLS extension due
                      to ambiguous specification in RFC4507.

                ALL       Enable various bug workarounds
                      suggested as "harmless" by OpenSSL
                      Be warned that this reduces SSL/TLS
                      strength to some attacks.

            See the OpenSSL SSL_CTX_set_options documentation for a
            more complete list.

The errors below seem to show a failure to agree on protocols and/or ciphers.
https://www.ssllabs.com/ssltest/viewMyClient.html may help.
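A quick local complement to that browser check is asking Python's ssl module what the proxy machine itself will offer, to compare against what Squid's OpenSSL build accepts (the version attributes assume Python 3.7+, newer than the 3.5.5 listed earlier):

```python
import ssl

# Inspect what the local Python/OpenSSL stack offers on the
# ProxHTTPSProxyMII side of the cache_peer connection.
ctx = ssl.create_default_context()
print(ssl.OPENSSL_VERSION)                        # linked OpenSSL build
print(ctx.minimum_version, ctx.maximum_version)   # TLS version bounds (3.7+)
print(len(ctx.get_ciphers()), "ciphers enabled")  # offered cipher suites
```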

(May. 09, 2018 04:09 PM)ryszardzonk Wrote:  
Code:
2018/05/10 19:25:18 kid1| Error negotiating SSL on FD 12: error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol (1/-1/0)
2018/05/10 19:25:18 kid1| TCP connection to 127.0.0.1/3129 failed
...

while ProxHTTPSProxyMII's response to Squid's request looked as follows:

Code:
127.0.0.1 - - [10/May/2018 20:02:06] code 400, message Bad request syntax ('\x16\x03\x01\x013\x01\x00\x01/\x03\x03¡1d\xad¿²\x81C¯\x0f{\x8câøÛÑxñ\x94úc3y\x97î²xå=Ü-\x87\x00\x00¬À0À,À(À$À\x14À')
...

(May. 09, 2018 04:09 PM)ryszardzonk Wrote:  Is it even possible to connect the two together?

Probably Wink
JJoe Wrote:I think I understand but I haven't actually done it.
So...
May. 11, 2018, 05:23 PM (This post was last modified: May. 11, 2018 05:26 PM by ryszardzonk.)
Post: #214
RE: ProxHTTPSProxyMII: Reloaded
One thing I have noticed in the two weeks since I started fooling with caching/filtering SSL is that it is like stepping through a minefield: fixing one problem leads to two more, and the chain seems infinite. Having said that, now to the fun part In Awe
(May. 11, 2018 02:42 AM)JJoe Wrote:  I haven't noticed this. I have rarely seen the connection timing out before the address is resolved on the first visit. The cached certificate might help but it has been too rare to study.
When this happens on Windows the ProxHTTPSProxyMII log shows something like

Quote:(Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x05C67630>: Failed to establish a new connection: [Errno 11001] getaddrinfo failed',))

This is more of a missing feature of urllib3 than a bug. The time limit can be increased.
I do not notice enough of a delay browsing web pages through my i3 Ironlake server for it to be a timing issue. If it is one, however, any idea what I might change to give a website more time to respond properly?

urllib3, on the other hand, has over 100 open bugs. Some of them involve certificates, and one even certificates and Squid:
https://github.com/urllib3/urllib3/issues/1384
https://github.com/urllib3/urllib3/issues/476

(May. 11, 2018 02:42 AM)JJoe Wrote:  Assuming that you haven't disabled verification in ProxHTTPSProxyMII, did you add Squid's certificate to ProxHTTPSProxyMII certificate store (cacert.pem), like you added ProxHTTPSProxyMII's to the browser's?
I have not disabled verification in ProxHTTPSProxyMII, and it was a very good idea to add my local Squid certificate to cacert.pem, but unfortunately it did not work.

My other idea was to use the newest beta version of Squid, 4.0.24, with a patch that was supposed to help with SSL, but that did not work either (and yes, I did update squid.conf for the changes introduced in version 4).

(May. 11, 2018 02:42 AM)JJoe Wrote:  What about:
sslcipher=... The list of valid SSL ciphers to use when connecting
to this peer.
ssloptions=... Specify various SSL implementation options:
I tried a few more of those, including ones that disable some TLS options, like tls-options=NO_TICKET and tls-min-version=1.2, but with no luck either.

With tls-min-version=1.2 I even got ProxHTTPSProxyMII to throw an exception, after which, to my surprise, it kept working:
Code:
Exception happened during processing of request from ('127.0.0.1', 36972)
Traceback (most recent call last):
  File "/usr/lib64/python3.5/socketserver.py", line 625, in process_request_thread
    self.finish_request(request, client_address)
  File "/usr/lib64/python3.5/socketserver.py", line 354, in finish_request
    self.RequestHandlerClass(request, client_address, self)
  File "/usr/lib64/python3.5/socketserver.py", line 681, in __init__
    self.handle()
  File "/usr/lib64/python3.5/http/server.py", line 422, in handle
    self.handle_one_request()
  File "/opt/proxhttpsproxy/ProxyTool.py", line 115, in handle_one_request
    BaseHTTPRequestHandler.handle_one_request(self)
  File "/usr/lib64/python3.5/http/server.py", line 410, in handle_one_request
    method()
  File "ProxHTTPSProxy.py", line 196, in do_METHOD
    if any((fnmatch.fnmatch(self.host, pattern) for pattern in pools.blacklist)):
  File "ProxHTTPSProxy.py", line 196, in <genexpr>
    if any((fnmatch.fnmatch(self.host, pattern) for pattern in pools.blacklist)):
  File "/usr/lib64/python3.5/fnmatch.py", line 34, in fnmatch
    name = os.path.normcase(name)
  File "/usr/lib/python-exec/python3.5/../../../lib64/python3.5/posixpath.py", line 54, in normcase
    "not '{}'".format(s.__class__.__name__))
TypeError: normcase() argument must be str or bytes, not 'NoneType'
----------------------------------------
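The traceback bottoms out in normcase() receiving None, which suggests self.host was never set for that request. A hypothetical guard, not the project's actual fix, that would avoid the exception:

```python
import fnmatch

BLACKLIST = ["*.doubleclick.net", "ads.*"]   # example patterns only

def is_blacklisted(host):
    # host can end up None for a malformed or aborted request; fnmatch
    # feeds it to os.path.normcase, which raises the TypeError seen above
    if not host:
        return False
    return any(fnmatch.fnmatch(host, pattern) for pattern in BLACKLIST)
```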

(May. 11, 2018 02:42 AM)JJoe Wrote:  Browser>>ProxHTTPSProxyMII front>>Privoxy>>ProxHTTPSProxyMII Rear>>Squid

If the chain is as above, you could disable verification in ProxHTTPSProxyMII and certificate creation in Squid. Then ProxHTTPSProxyMII would hide the mitm from the browser and Squid would verify the site's certificates.

This sounds very reasonable and just might work. I have not tried it yet, as I wanted to check all the options of the current configuration first, because there is one thing that might be problematic with it: when I use iptables to redirect port 443 traffic to 3129 (the ProxHTTPSProxyMII FrontPort), it does not work for me. ProxHTTPSProxyMII will not accept intercepted traffic the way Squid does with the intercept flag for transparent proxying.
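For reference, transparent proxies recover the pre-REDIRECT destination from the socket itself via the Linux-only SO_ORIGINAL_DST option, which ProxHTTPSProxyMII apparently does not do. A rough sketch of that lookup (the constant is copied from linux/netfilter_ipv4.h; this is untested against the proxy itself):

```python
import socket, struct

SO_ORIGINAL_DST = 80   # from linux/netfilter_ipv4.h (Linux only)

def parse_sockaddr_in(raw):
    # struct sockaddr_in: 2-byte family, big-endian u16 port, 4-byte IPv4
    port, packed_ip = struct.unpack_from("!2xH4s", raw)
    return socket.inet_ntoa(packed_ip), port

def original_dst(sock):
    """Recover the (ip, port) an intercepted client actually wanted,
    for a connection that arrived via an iptables REDIRECT rule."""
    raw = sock.getsockopt(socket.SOL_IP, SO_ORIGINAL_DST, 16)
    return parse_sockaddr_in(raw)
```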
May. 11, 2018, 08:16 PM (This post was last modified: May. 11, 2018 08:28 PM by vlad_s.)
Post: #215
RE: ProxHTTPSProxyMII: Reloaded
ryszardzonk
Yes, I use Squid as a transparent proxy. Here are all the configs:
privoxy
Code:
user-manual /usr/share/doc/privoxy/user-manual

confdir /etc/privoxy

logdir /var/log/privoxy
logfile logfile

# Actions and filters.
actionsfile match-all.action
actionsfile default.action
actionsfile adblock.action
actionsfile rpft.action
actionsfile antiban.action
actionsfile user.action
actionsfile network-set.action
actionsfile proxhttpsproxy.action

filterfile default.filter
filterfile rpft.filter
filterfile user.filter
filterfile proxhttpsproxy.filter

# Debug level.
#debug     1 # Log the destination for each request Privoxy let through. See also debug 1024.
#debug     2 # show each connection status
#debug     4 # show I/O status
#debug     8 # show header parsing
#debug    16 # log all data written to the network
#debug    32 # debug force feature
#debug    64 # debug regular expression filters
#debug   128 # debug redirects
#debug   256 # debug GIF de-animation
#debug   512 # Common Log Format
#debug  1024 # Log the destination for requests Privoxy didn't let through, and the reason why.
#debug  1025 # Show only passed and blocked requests.
#debug  2048 # CGI user interface
#debug  4096 # Startup banner and warnings.
#debug  8192 # Non-fatal errors
#debug 32768 # log all data read from the network
#debug 65536 # Log the applying actions

# Listen address and port.
listen-address 192.168.2.1:8118
listen-address 127.0.0.1:8118

# Enable the proxy; 0 disables it.
toggle 1

# Toggle the proxy on/off from the web interface (http://config.privoxy.org).
enable-remote-toggle 1

# Control toggling of filtering via HTTP headers.
enable-remote-http-toggle 0

# Allow editing actions and filters from the web interface.
enable-edit-actions 1

# Whether the user may override a block and "go there anyway".
enforce-blocks 0

# Maximum buffer size for content filtering.
buffer-limit 4096

# Proxy authentication forwarding.
enable-proxy-authentication-forwarding 0

# Number of retries if forwarding fails.
forwarded-connect-retries 0

# Enable "transparent" proxy mode (the nat table's PREROUTING chain in iptables must also be adjusted after enabling this option).
accept-intercepted-requests 0

# Whether requests for CGI pages may be blocked or redirected.
allow-cgi-request-crunching 0

# Whether the CGI interface should stay compatible with broken HTTP clients.
split-large-forms 0

# Number of seconds after which an open connection will no longer be reused.
keep-alive-timeout 30

# Whether pipelined requests are tolerated.
tolerate-pipelining 1

# Number of seconds a socket waits when no data is received.
socket-timeout 300

# Maximum number of client connections; defaults to 128 if unset.
max-client-connections 512

# Order in which client headers are sorted before being sent (default none).
#client-header-order Host \
#   Accept \
#   Accept-Language \
#   Accept-Encoding \
#   Proxy-Connection \
#   Referer \
#   Cookie \
#   DNT \
#   If-Modified-Since \
#   Cache-Control \
#   Content-Length \
#   Content-Type

proxhttpsproxy
Code:
### The parent proxy has to support CONNECT method, if you want to proxy HTTPS requests
###
### Proxy setting applies to HTTPS requests only, as it is applied by the Rear Server
### HTTP requests are passed to and handled by Proxomitron, please set up Proxomitron for proxy

[GENERAL]
ProxAddr = http://localhost:8118
FrontPort = 8079
RearPort = 8081
# DefaultProxy = http://127.0.0.1:8118

# Proper values for LogLevel are ERROR, WARNING, INFO, DEBUG
# Default is INFO if unset
LogLevel = WARNING

# * matches everything
# ? matches any single character
# [seq] matches any character in seq
# [!seq] matches any character not in seq

[PROXY http://192.168.8.100:8888]
*sbrf.ru
*sberbank.ru
*sberbank.com
sbi.sberbank.ru

[PROXY http://127.0.0.1:8118]
#*
*.i2p
*.onion

[PROXY socks5://127.0.0.1:9050]
*.inbox.lv
*.linkedin.com

### Ignore SSL certificate verify, Use at your own risk!!!
### Proxy setting still effective
[SSL No-Verify]
192.168.2.1
home-router

[BLACKLIST]

### Bypass Proxomitron and the Rear Server, Proxy setting still effective
### SSL certificate verify will be done by the browser
[SSL Pass-Thru]
*.plex.direct
*.plex.tv
plex.tv
pypi.python.org
sls.update.microsoft.com
storeedgefd.dsx.mp.microsoft.com
settings-ssl.xboxlive.com
collections.md.mp.microsoft.com
paymentinstruments.mp.microsoft.com
musicdelivery-ssl.xboxlive.com
notepad-plus-plus.org

# Microsoft SmartScreen Filter Service
*.smartscreen.microsoft.com
urs.microsoft.com

# NoScript uses https://secure.informaction.com/ipecho to detect the WAN IP
# https://addons.mozilla.org/en-US/firefox/addon/noscript/privacy/
secure.informaction.com

### Bypass Proxomitron and the Rear Server, Proxy setting still effective
### This section supports URL matching
[BYPASS URL]
http://www.abc.com/*
https://bcd.net/*
*://feedly.com/*
*.zip
*.rar
*.exe
*.pdf

squid
Code:
# Access for the local network.
acl localnet src 192.168.0.0/16
acl localnet src fc00::/7
acl localnet src fe80::/64
acl globalIPv6 src ipv6

# Ports that may be accessed through the proxy over HTTP.
acl ftp proto FTP
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
#acl directlist url_regex -i "/etc/squid/directlist"

# TTL (time to live) for cached error messages (e.g. "connection refused" or "404 Not Found").
negative_ttl 0 seconds

# Squid uses the same DNS server as the clients.
dns_nameservers 192.168.2.1

# Avoids the error "WARNING! Your cache is running out of filedescriptors" and possibly "read/write failure: (32) Broken pipe".
max_filedescriptors 8192

# Enable strict verification that the Host header matches the domain name.
#host_verify_strict on

# Make Squid anonymous.
via off
forwarded_for off
follow_x_forwarded_for deny all
request_header_access cache-control deny all

# Deny access to ports not in the list above.
http_access deny !Safe_ports

# Deny the CONNECT method to non-SSL ports.
http_access deny CONNECT !SSL_ports

# Allow cache management only from localhost.
http_access allow localhost manager
http_access deny manager

# Allow IPv6.
http_access allow globalIPv6

# Access for local-network clients.
http_access allow localnet

# Access for the server itself.
http_access allow localhost

# Deny everyone else.
http_access deny all

# Port 3128 is the non-transparent proxy; transparent HTTP uses 3129 and HTTPS 3130.
http_port 3128
http_port 3129 intercept
https_port 3130 intercept ssl-bump connection-auth=off options=ALL cert=/etc/squid/squidCA.pem

# Accept certificates even if they fail verification.
sslproxy_flags DONT_VERIFY_PEER

# Disable server certificate error checking.
sslproxy_cert_error allow all

# Rule with the list of blocked resources (the file holds domains of the form .domain.com).
# If #1, #2 and #3 are commented out, Squid (and the whole proxy chain) will not look at SNI.
#acl blocked ssl::server_name "/etc/squid/blocklist" #1
acl step1 at_step SslBump1
ssl_bump peek step1

# Terminate the connection if the client visits a blocked resource.
#ssl_bump terminate blocked #2
#ssl_bump splice all #3

# Where crash core dumps are saved.
coredump_dir /var/spool/squid

refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
refresh_pattern .               0       20%     4320

# In-memory cache size and maximum size of an object cached in memory.
cache_mem 4096 MB
maximum_object_size_in_memory 4096 KB

# Disk cache size and location.
cache_dir ufs /var/spool/squid 2048 16 256

# Maximum object size in the disk cache.
maximum_object_size 4 MB

# Log location and rotation.
access_log daemon:/var/log/squid/access.log squid
logfile_rotate 10
debug_options ALL,1 85,0

# Error pages in Russian.
error_directory /usr/share/squid/errors/ru

# Upstream HTTP proxy.
cache_peer 127.0.0.1 parent 8118 0 no-query no-digest
cache_peer_access 127.0.0.1 deny CONNECT

# Upstream HTTPS proxy.
cache_peer 127.0.0.2 parent 8079 0 no-query
cache_peer_access 127.0.0.2 allow CONNECT

# Allow or deny direct connections (always_direct) and for what (FTP etc.).
always_direct allow ftp
always_direct deny all

# Never connect directly (never_direct); apply to all.
never_direct allow all
cache_effective_user proxy
cache_effective_group proxy

Note the trick in the Squid config: Squid cannot address two upstream proxies at the same address, so there are two, 127.0.0.1 and 127.0.0.2. The HTTP CONNECT method is denied for port 8118 and allowed for 8079, which is how I managed to separate the two kinds of traffic.
iptables rules:
Code:
*nat
-A PREROUTING -i br0 -p tcp -m tcp --dport 80 -j REDIRECT --to-ports 3129
-A PREROUTING -i br0 -p tcp -m tcp --dport 443 -j REDIRECT --to-ports 3130
This works for IPv4 and IPv6; br0 is the local interface.
May. 12, 2018, 05:40 AM (This post was last modified: May. 12, 2018 08:36 AM by ryszardzonk.)
Post: #216
RE: ProxHTTPSProxyMII: Reloaded
@vlad_s
Thanks Thumbs Up It indeed worked. The secret lies in the way Squid handles SSL traffic. Your example does just basic SSL proxying:

Code:
acl step1 at_step SslBump1
ssl_bump peek step1

while mine does full caching, requiring Squid to recreate SSL certificates:
Code:
ssl_bump peek all
ssl_bump bump all

SSL bumping led to SSL errors in the browsers even after adding the certificates for both Squid and ProxHTTPSProxyMII to them. I guess full SSL traffic caching while chaining to another proxy that does its own SSL recreation is still ahead of us; even Squid 4 has a number of bugs open on it.

One problem I have now is that my Squid logs get flooded with security warnings, about five a second, for this site:

Code:
2018/05/12 07:29:58 kid1| SECURITY ALERT: Host header forgery detected on local=172.217.20.195:443 remote=192.168.101.182:59425 FD 17 flags=33 (local IP does not match any domain IP)
2018/05/12 07:29:58 kid1| SECURITY ALERT: on URL: connectivitycheck.gstatic.com:443

I added the following before the cache_peer settings, so that this site would go direct and skip the Squid cache altogether, but without any luck:
Code:
# Problematic SSL Sites
acl sslproblems dstdomain .gstatic.com
always_direct allow sslproblems

EDIT: Those security alerts come from Chrome on Windows, which does its own security checks and is not satisfied with certificates issued by ProxHTTPSProxyMII. On Android, when one uses Chrome to install a certificate, it is accepted for the whole device, including other applications. Further investigation ongoing...
EDIT2:
The search did not take too long... here and here. The real question is: can anything be done about that?
May. 12, 2018, 08:59 AM
Post: #217
RE: ProxHTTPSProxyMII: Reloaded
Yes, I have a basic Squid configuration; all I need from it is a transparent proxy. Do you want to cache encrypted traffic? As I understand from other users, it does not make sense. I modified Squid as described at https://habr.com/sandbox/99037/ and used the compile options no-verify-header and max_url 16k.
May. 12, 2018, 04:16 PM
Post: #218
RE: ProxHTTPSProxyMII: Reloaded
(May. 12, 2018 08:59 AM)vlad_s Wrote:  Do you want to cache encrypted traffic? As I understand from other users, it does not make sense.
Back in the day of unencrypted websites and 28.8/56K modems, that is exactly what Squid was for. These days sites are more dynamic, the morning's news is no longer news by the evening, and some of us have access to 1 Gb Internet speeds, so it is not much of a necessity; I would take transparent proxying with Squid + ProxHTTPSProxyMII/Privoxy any day over Squid with full caching but no advert filtering. That does not mean I would not wish it worked too, and that there were a way for Squid and ProxHTTPSProxyMII to talk to each other nicely and use each other's abilities to a greater extent.

(May. 12, 2018 08:59 AM)vlad_s Wrote:  I modified the squid https://habr.com/sandbox/99037/

Following those instructions I prepared a patch for 4.0.24:
Code:
diff -Naur squid-4.0.24.old/src/client_side_request.cc squid-4.0.24/src/client_side_request.cc
--- squid-4.0.24.old/src/client_side_request.cc 2018-05-12 15:25:41.749243538 +0200
+++ squid-4.0.24/src/client_side_request.cc     2018-05-12 15:32:05.666035487 +0200
@@ -522,6 +522,11 @@
     c->hostHeaderIpVerify(ia, dns);
}

+debugs(85, DBG_IMPORTANT, "SECURITY ALERT: Workaruond for " << urlCanonical(http->request) << " CONN: " << http->getConn()->clientConnection );
+http->request->flags.hostVerified = true;
+http->doCallouts();
+return;
+
void
ClientRequestContext::hostHeaderIpVerify(const ipcache_addrs* ia, const Dns::LookupDetails &dns)
{

Unfortunately it no longer compiles with that version:
Code:
x86_64-pc-linux-gnu-g++ -DHAVE_CONFIG_H -DDEFAULT_CONFIG_FILE=\"/etc/squid/squid.conf\" -DDEFAULT_SQUID_DATA_DIR=\"/usr/share/squid\" -DDEFAULT_SQUID_CONFIG_DIR=\"/etc/squid\"   -I.. -I../include -I../lib -I../src -I../include    -I../src    -Wall -Wpointer-arith -Wwrite-strings -Wcomments -Wshadow -Woverloaded-virtual -Wno-deprecated-register -pipe -D_REENTRANT -march=native -O2 -pipe -fgcse-sm -fgcse-las -fgcse-after-reload -ftree-vectorize -fabi-version=0 -c -o clientStream.o clientStream.cc
In file included from ../src/sbuf/SBuf.h:16:0,
                 from ../src/anyp/PortCfg.h:16,
                 from ../src/AccessLogEntry.h:12,
                 from acl/FilledChecklist.h:12,
                 from client_side_request.cc:20:
../src/Debug.h:122:4: error: expected unqualified-id before ‘do’
    do { \
    ^
client_side_request.cc:525:1: note: in expansion of macro ‘debugs’
debugs(85, DBG_IMPORTANT, "SECURITY ALERT: Workaruond for " << urlCanonical(http->request) << " CONN: " << http->getConn()->clientConnection );
^~~~~~
../src/Debug.h:133:6: error: expected unqualified-id before ‘while’
    } while (/*CONSTCOND*/ 0)
      ^
client_side_request.cc:525:1: note: in expansion of macro ‘debugs’
debugs(85, DBG_IMPORTANT, "SECURITY ALERT: Workaruond for " << urlCanonical(http->request) << " CONN: " << http->getConn()->clientConnection );
^~~~~~
client_side_request.cc:526:1: error: ‘http’ does not name a type; did you mean ‘Http1’?
http->request->flags.hostVerified = true;
^~~~
Http1
client_side_request.cc:527:1: error: ‘http’ does not name a type; did you mean ‘Http1’?
http->doCallouts();
^~~~
Http1
client_side_request.cc:528:1: error: expected unqualified-id before ‘return’
return;
^~~~~~
cc1plus: warning: unrecognized command line option ‘-Wno-deprecated-register

(May. 12, 2018 08:59 AM)vlad_s Wrote:  and used the keys for compilation no-verify-header and max_url 16k.
Please elaborate; I did not really understand that last part.
May. 12, 2018, 04:28 PM
Post: #219
RE: ProxHTTPSProxyMII: Reloaded
(May. 12, 2018 05:40 AM)ryszardzonk Wrote:  EDIT: Those security alerts come from Chrome on Windows which does its own security checks and is not satisfied with certificates issued by ProxHTTPSProxyMII.

How do I replicate?
To check Chrome I used https://portableapps.com/apps/internet/google_chrome_portable which shows a green lock and 'certificate valid, issued by ProxHTTPSProxy CA', etc...
May. 13, 2018, 06:16 AM (This post was last modified: May. 15, 2018 05:10 AM by ryszardzonk.)
Post: #220
RE: ProxHTTPSProxyMII: Reloaded
(May. 12, 2018 04:28 PM)JJoe Wrote:  
(May. 12, 2018 05:40 AM)ryszardzonk Wrote:  EDIT: Those security alerts come from Chrome on Windows which does its own security checks and is not satisfied with certificates issued by ProxHTTPSProxyMII.

How do I replicate?
To check Chrome I used https://portableapps.com/apps/internet/google_chrome_portable which shows green lock and 'certificate valid issued by ProxHTTPSProxy CA' etc...
What I said was right, but only partially. The issues appeared the same in Chrome and Edge, but not in Firefox on Windows 10. Everything was due to certificate installation, which behaves differently across those programs.

What I did:
I have Apache running, so I edited the CA.crt file to remove the private key from it and placed it on the local www site. Then, when clicking 192.168.1.1/CA.crt in Firefox (both Windows and Linux), it installed properly for that browser. It also went fine for Chrome on Android. Chrome and Edge on Windows 10, however, did not use an internal browser store but the system's certificate import wizard. According to the default settings used ("Automatically select the certificate store based on the type of certificate"), the wizard had everything installed properly. I do not know whether the system deleted it, misplaced it, or simply did not use it, but the result was those security checks and certificate verification errors.

To fix it I installed it not into the default store by clicking the certificate, but with these steps: http://community.lightspeedsystems.com/d...indows-10/ The ProxHTTPSProxyMII certificate authority all of a sudden started working for both Edge and Chrome.

EDIT:
It turned out that my biggest problem running ProxHTTPSProxyMII was that my server and my client machine were running with unsynchronized clocks: the client's clock was about 35 seconds behind the server's. That led to log entries like this every single time a new website was visited
Quote:[SSL: SSLV3_ALERT_BAD_CERTIFICATE] sslv3 alert bad certificate (_ssl.c:2091)" while trying to establish local SSL tunnel for [younameit.com:443]
and warnings in the client's web browsers about an improper certificate for the website. Looking closely at one of the warnings, I noticed it was a certificate created by ProxHTTPSProxyMII which, a few seconds later and without me doing anything, got accepted. Why? Because according to my client, the freshly created certificate was from the future and therefore not yet valid...
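The failure mode can be illustrated with plain datetime arithmetic. The timestamps and the 35-second offset below are just the figures from this post, not measurements:

```python
from datetime import datetime, timedelta, timezone

# The proxy mints a leaf certificate whose notBefore is "now" by the
# server's clock; the client's clock lags by ~35 s, so for the client the
# certificate starts in the future and is rejected as not yet valid.
server_now = datetime(2018, 5, 15, 17, 55, 0, tzinfo=timezone.utc)
client_now = server_now - timedelta(seconds=35)   # client clock is behind

not_before = server_now                            # validity start on the cert
print(client_now >= not_before)                           # False: rejected
print(client_now + timedelta(seconds=40) >= not_before)   # True: accepted later
```

This matches the observed behavior: the same site fails on the first click and works moments later, once the client's clock has caught up past notBefore. Running NTP on both machines makes the problem disappear.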
May. 14, 2018, 09:15 PM
Post: #221
RE: ProxHTTPSProxyMII: Reloaded
That article relates to squid 3.5.x; it fixes the "SECURITY ALERT: Host header forgery detected on..." errors.
As for the other issues, I compiled and installed the packages but unfortunately did not record what changed there; only the names of the archived deb packages indicate the compilation flags. I need to check the "rules" file, which is in the virtual machine; I'll look at it later.
max_url affects almost nothing, but when you edit the user.action file, squid can give an error when saving it.
May. 15, 2018, 04:07 AM
Post: #222
RE: ProxHTTPSProxyMII: Reloaded
(May. 13, 2018 06:16 AM)ryszardzonk Wrote:  EDIT:
It turned out that my biggest problem running ProxHTTPSProxyMII was that my server and my client machine were running with unsynchronized clocks

Good catch! I appreciate the effort. Probably belongs in a FAQ.
May. 15, 2018, 08:45 AM (This post was last modified: May. 15, 2018 06:54 PM by ryszardzonk.)
Post: #223
RE: ProxHTTPSProxyMII: Reloaded
(May. 14, 2018 09:15 PM)vlad_s Wrote:  That article was related to squid 3.5.X, fixes errors "SECURITY ALERT: Host header forgery detected on..."
Yeah, since then I decided to use squid-4.0.24-20180410, which is the most recent available (it seems less problematic than stock 4.0.24). After all, in its present config Squid does not do any certificate validity checking (that is left entirely to ProxHTTPSProxyMII), nor does it mangle encrypted traffic with ssl_bump, so breaking TLS is less likely and the experimental patches are not really needed.

(May. 15, 2018 04:07 AM)JJoe Wrote:  
(May. 13, 2018 06:16 AM)ryszardzonk Wrote:  EDIT:
It turned out that my biggest problem running ProxHTTPSProxyMII was that my server and my client machine were running with unsynchronized clocks

Good catch! I appreciate the effort. Probably belongs in a FAQ.
There is more stuff that might be useful for others, if you want Wink

I have prepared installation scripts for Gentoo Linux which set up the whole chain to run ProxHTTPSProxyMII in transparent proxy mode thanks to squid:
- adblock2privoxy - a program that converts any Adblock filter list into one understood by Privoxy (I hope someone will step up some day and update the PCRE version understood by Privoxy so that step is no longer needed)
- squid - an installation script for a transparent proxy using version 4.0.24+, not yet available in Portage (Gentoo's package system)
- ProxHTTPSProxyMII - you know that one :P As for the script itself, the Python packages should be installed somewhat differently in Gentoo to be available for all Python versions, which I do not know how to do, but the script is good enough; the only thing it is missing is creating a new system user so the proxy does not run as root. I might add that some time in the future.

For now, the most problematic part of using a network-wide transparent proxy with SSL filtering is Google, which changed the default policy to not accept locally issued CA certificates, so Android apps simply stopped working in most cases. Adding information to the FAQ about the need to have your phone rooted, or to have Magisk installed (which lets CA.crt be accepted as a system certificate without rooting the phone), might also be welcome.
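For background: apps targeting Android 7+ trust user-installed CAs only if the app itself opts in through a network security config, which is why rooting or Magisk (to move CA.crt into the system store) is needed for third-party apps. An app that does opt in ships a res/xml/network_security_config.xml along these lines, shown here only to illustrate the mechanism; you cannot add this to someone else's app:

```xml
<?xml version="1.0" encoding="utf-8"?>
<network-security-config>
    <base-config>
        <trust-anchors>
            <!-- system CAs plus certificates the user installed, e.g. CA.crt -->
            <certificates src="system" />
            <certificates src="user" />
        </trust-anchors>
    </base-config>
</network-security-config>
```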

PS The adblock2privoxy installation script might be added to the haskell overlay.

EDIT: Also worth noting: Google Chromecast refuses to work with a local CA. There is no option in the Chromecast application to set up a proxy or to accept a locally issued CA.
May. 16, 2018, 04:28 PM (This post was last modified: May. 16, 2018 04:29 PM by vlad_s.)
Post: #224
RE: ProxHTTPSProxyMII: Reloaded
(Apr. 09, 2018 12:57 AM)JJoe Wrote:  Try https://curl.haxx.se/docs/caextract.html for a more current file.

Use a browser to capture and export the certificate in pem format. Use an editor (notepad) to add the cert to 'cacert.pem'.

How can I extract the public key? The certificate from that link does not help. I need to browse the site https://uslugi.tatarstan.ru/; Waterfox 55 will not open it (it says SEC_ERROR_UNKNOWN_ISSUER), IE does not retrieve the key (the button is grayed out), and MS Edge either does not know how or I do not know how.
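If no browser will export the certificate, Python's standard library can fetch it directly. A sketch, with one caveat: ssl.get_server_certificate returns only the server's own leaf certificate, not the missing intermediate that causes SEC_ERROR_UNKNOWN_ISSUER, so the intermediate still has to be obtained separately.

```python
import ssl

def fetch_leaf_pem(host, port=443):
    """Fetch the server's leaf certificate as PEM text (needs network access)."""
    return ssl.get_server_certificate((host, port))

def append_to_bundle(pem, bundle_path="cacert.pem"):
    """Append a PEM certificate to an existing CA bundle file."""
    with open(bundle_path, "a") as f:
        f.write("\n" + pem.strip() + "\n")

# Usage (network required):
#   append_to_bundle(fetch_leaf_pem("uslugi.tatarstan.ru"))
```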
May. 16, 2018, 04:49 PM (This post was last modified: May. 16, 2018 04:58 PM by vlad_s.)
Post: #225
RE: ProxHTTPSProxyMII: Reloaded
I found the certificate, here it is: https://www.tbs-certificates.co.uk/FAQ/e..._2018.html
Strange, but even a fresh Mozilla Firefox 59 gives an error on this site (link above).