Network folder synced to OneDrive/SharePoint

The SharePoint synchronization mechanism, which uses groove.exe in the background, goes to great lengths to block synchronization to a network folder.

Whilst this might be explained away by asking why one would sync a network location such as SharePoint to another network folder, there are situations where it is required. One such requirement is when the system is running in a VM and the data needs to be synchronized to a VM shared folder, which is visible to the operating system as a network drive and thus available to different VMs, avoiding the waste of space that would result from putting that folder on the “C:” drive.

The workaround which works pretty well is to use a symbolic link. Before doing so, make sure to close any instance of groove.exe and any other software using data in the synchronized folder.

Should you have your folders synchronized already, the steps are:

  1. Stop all groove.exe processes.
  2. Rename the existing folder, in the example below <Company Name Team>. If the OneDrive folder was selected to be c:\OneDrive, it will be c:\OneDrive\<Company Name Team>, and there will be another, personal OneDrive folder next to it.
    If this works fine, it means that all programs were closed properly; otherwise an error is raised.
  3. Open a CLI with Administrator privileges and create the symbolic link, as sketched after this list.
  4. Re-run OneDrive and/or Groove to verify that everything has been recognized properly.
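A sketch of the link creation, assuming c:\OneDrive as the OneDrive root and a VMware-style shared-folder UNC path (both illustrative):

    mklink /D "C:\OneDrive\<Company Name Team>" "\\vmware-host\Shared Folders\<Company Name Team>"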

This should be as simple as that.

Important: synchronization of a private folder on SharePoint requires a newer OneDrive version than the one used for shared folders, and this version has an additional check which does not seem to accept the above trick.

Kodi and ssl_bump Squid

UPDATE 2017-11-25: Kodi changed its modules and the way certificates are checked (certifi and schism).

A friend of mine, a happy user of a freshly baked private DLP based on Squid and ssl_bump, quickly realized that to update his add-ons he had to bypass the ssl_bump-based proxy.

Thorough checking showed that Kodi uses Python libraries with their own local certificates and trusted Certificate Authorities (at least on Windows). Troubleshooting led to the CA bundle files: the OLD one (before 2017-11-25) and, after the module change, the NEW ones.
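Illustrative locations only; the exact paths depend on the Kodi install (certifi replaced the bundle previously shipped inside the requests module):

    :: OLD (before 2017-11-25) - bundle inside the requests module:
    C:\Program Files (x86)\Kodi\system\python\Lib\site-packages\requests\cacert.pem
    :: NEW - certifi bundle in the bundled Python:
    C:\Program Files (x86)\Kodi\system\python\Lib\site-packages\certifi\cacert.pem
    :: AND in the add-on module directory:
    C:\Users\<user>\AppData\Roaming\Kodi\addons\script.module.certifi\lib\certifi\cacert.pem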

All that was needed was to add his Root CA certificate content at the end of the file (CRT/PEM format):
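For example, from a Windows command prompt (file names assumed):

    type myRootCA.crt >> cacert.pem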

Afterwards everything came back to normal.

Log all request details on Squid

Sometimes it is required to log all request details on Squid, e.g. when you need to figure out the details to write your own URL rewriter, to optimize video caching, or for other statistics.

By default, Squid strips everything after “?” from logged URLs, and all we need to do is turn that off:

strip_query_terms off

More details at http://www.squid-cache.org/Doc/config/strip_query_terms/

 

Chaining Squid URL rewriters – custom URL rewriter chained with SquidGuard

Below describes how to get URL filtering based on SquidGuard together with URL rewriting for quality optimization/video caching, etc. The article covers the basics and how to set up a URL rewrite, and later how to chain multiple URL rewriters.

Basic URL rewrite

Basic URL rewrite has been covered here https://blob.mypn.eu/get-the-resolution-right-squid-basic-url-rewrite-script/.

SquidGuard URL filtering

SquidGuard URL filtering, how to set it up and keep it alive, has been covered here: https://blob.mypn.eu/squidguard-url-filtering/

Chain multiple URL rewriters

Squid, at least in v3.5, allows defining only a single url_rewrite_program, which causes a set of implications. The main disadvantage is that, without the use of an external program, it is impossible to chain multiple URL rewriters, as needed in our case.

The aim is to chain the URL bitrate rewrite above with SquidGuard filtering of unwanted content, e.g. advertisements (SquidGuard category adv/ads). The perfect solution would be to be able to use ACLs to, for example, direct video-related domains to the URL rewriter whilst sending the rest to SquidGuard for filtering.

Given today's limitations, the simplest (read: lazy) way is to use a chaining script which will do the work for us. Checking online one will find http://adzapper.sourceforge.net/#download, which provides two scripts: wrapzap and zapchain. These scripts were created by Cameron Simpson back in 2000/2001, so quite some time ago, and are often referenced together with multiple examples of how to get them working.

wrapzap & zapchain

Wrapzap is used to set all the environment variables; however, I can't find any of them being required for our example. At the bottom, the script calls the real zapchain with the selected URL rewriters.
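The tail of wrapzap boils down to something like this (paths are assumptions):

    exec /usr/local/bin/zapchain "/etc/squid/rewrite-script.pl" \
        "/usr/bin/squidGuard -c /etc/squidguard/squidGuard.conf"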

Regardless of the number of tries, I could not get this working, despite online reports suggesting that it should just work. The other tested option was to run zapchain directly from squid.conf with the selected rewriters.
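E.g. (again with assumed paths):

    url_rewrite_program /usr/local/bin/zapchain "/etc/squid/rewrite-script.pl" "/usr/bin/squidGuard -c /etc/squidguard/squidGuard.conf"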

The main difference was probably due to the way URL rewriters were supposed to work then versus nowadays. In the past, it seems, a rewriter would output just the new URL, whilst the modern implementation expects this syntax:
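As documented for Squid 3.5 URL-rewrite helpers, the reply is a result code plus optional key=value pairs rather than a bare URL:

    [channel-ID] OK rewrite-url="http://new.example.com/url"      - silent rewrite
    [channel-ID] OK status=302 url="http://new.example.com/url"   - redirect
    [channel-ID] ERR                                              - leave URL unchanged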

This required additional parsing and a rewrite of the original zapchain script to deal with the modified output.
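A minimal sketch of the idea (not the full rewritten script): spawn each helper, pass the URL through them in order, and translate the key=value answers back into a single reply for Squid. It assumes channel-IDs (concurrency) are in use; helper commands and paths are up to you.

    #!/usr/bin/perl
    # chain several Squid url_rewrite helpers; each @ARGV element is one helper command
    use strict;
    use warnings;
    use IPC::Open2;

    $| = 1;                                  # unbuffered replies to Squid

    my @helpers;
    for my $cmd (@ARGV) {
        my ($out, $in);
        open2($out, $in, $cmd);
        select((select($in), $| = 1)[0]);    # unbuffered input to each helper
        push @helpers, { in => $in, out => $out };
    }

    while (my $line = <STDIN>) {
        chomp $line;
        my ($id, $url, @rest) = split ' ', $line;
        my $orig = $url;
        for my $h (@helpers) {
            # note: old-style helpers may not expect the channel-ID - strip/re-add as needed
            print { $h->{in} } join(' ', $id, $url, @rest), "\n";
            my $reply = readline $h->{out};
            last unless defined $reply;
            chomp $reply;
            # new-style helpers answer with key=value pairs...
            if ($reply =~ /(?:rewrite-)?url="([^"]+)"/) { $url = $1; }
            # ...old-style ones may print just the bare rewritten URL
            elsif ($reply =~ m{^\s*(\w+://\S+)})        { $url = $1; }
        }
        print $url eq $orig ? "$id ERR\n" : qq($id OK rewrite-url="$url"\n);
    }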

This provided the expected results, feeding the output of one URL rewriter into the next. My use case rarely results in double modification of the output. The order in which the rewriters are called should reflect their hit rates: the ad blocker implemented with SquidGuard gets much more traffic and will short-circuit all calls to ads, effectively putting less load on the second rewriter.

Get the resolution right – Squid basic URL rewrite script

Squid allows using a URL rewrite program to alter a URL silently (rewrite) or, the preferred method, to redirect to another URL. Mobile apps often rely on data retrieved from the URL whilst at the same time not supporting redirections (e.g. web TV/movie platforms).

In the simplest form the rewrite configuration could look like:
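A minimal sketch; the ACL name and domain match the description below, while the script path is an assumption:

    acl rewrite_quality dstdomain .some.cdn.network.inexstent
    url_rewrite_program /etc/squid/rewrite-script.pl
    url_rewrite_access allow rewrite_quality
    url_rewrite_access deny all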

What it does is that all calls to domains defined as part of the rewrite_quality acl, in this case .some.cdn.network.inexstent, are passed through rewrite-script.pl.

Squid launches the defined script upon start (a number of instances, depending on url_rewrite_children) and passes requests to the script's STDIN.

The full description of a request as passed by Squid is:
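Per the Squid documentation, each line handed to the helper is the URL plus optional extras:

    [channel-ID <SP>] URL [<SP> extras]<NL>

where the default url_rewrite_extras format is "%>a/%>A %un %>rm myip=%la myport=%lp", i.e. client IP/FQDN, user, request method, and the local IP/port the request came in on.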

Further described at http://wiki.squid-cache.org/Features/AddonHelpers. The example request passed to the script looks like:
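For instance (illustrative values):

    0 http://some.cdn.network.inexstent/video/playlist?bitrate=3000000 192.168.1.10/- - GET myip=192.168.1.1 myport=3128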

Additional details around url_rewrite_program can be found at http://www.squid-cache.org/Doc/config/url_rewrite_program/

Custom URL rewrite script

A simple URL rewrite script to rewrite the bitrate part of the URL for TVN Player / Player (aka player.pl) could look as follows:
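A sketch of what such a script might look like; the bitrate URL parameter and target value are assumptions for illustration:

    #!/usr/bin/perl
    # hypothetical rewrite-script.pl: force a fixed bitrate in matching URLs
    use strict;
    use warnings;

    $| = 1;                                  # Squid requires unbuffered replies

    while (my $line = <STDIN>) {
        chomp $line;
        # with concurrency enabled the channel-ID comes first
        my ($id, $url) = split ' ', $line;
        if ($url =~ s/bitrate=\d+/bitrate=180000/) {   # deliberately low, for tests
            print qq($id OK rewrite-url="$url"\n);     # silent rewrite
        } else {
            print "$id ERR\n";                         # pass through unchanged
        }
    }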

The script as above then needs to be referenced within squid.conf.
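For instance, repeating the program line from the minimal config above, plus helper-children tuning:

    url_rewrite_program /etc/squid/rewrite-script.pl
    url_rewrite_children 5 startup=1 idle=1 concurrency=10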

The motivation for the above was that, for unknown reasons, web and mobile players behaved differently and a very bad quality was selected on some players regardless of the available bandwidth. The above forced the proper quality, though it certainly has a lot of drawbacks due to the silent URL rewrite, as it forces all other clients to the same selected quality. Note that the example above sets a very low quality, for tests.

SquidGuard – URL filtering

SquidGuard is one of the very well known URL filtering solutions. Paired with a good URL/domain list, it is a powerful and fast solution.

SquidGuard installation is very simple and well described on the internet.

Example squidGuard.conf  could look like:
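A trimmed-down sketch, assuming the list lives under /var/lib/squidguard/db and a local web server hosts a block page (both assumptions):

    dbhome /var/lib/squidguard/db
    logdir /var/log/squidguard

    dest adv {
        domainlist adv/domains
        urllist    adv/urls
    }

    dest spyware {
        domainlist spyware/domains
        urllist    spyware/urls
    }

    acl {
        default {
            pass !adv !spyware all
            redirect http://localhost/blocked.html
        }
    }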

By default, the config does not include the dest sections.

To generate them, as no ready-made list/script could easily be found, the below was quickly written:
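Something along these lines (paths assumed; note the flaw described just below):

    #!/bin/bash
    # emit a squidGuard "dest" section for every category folder in the db
    cd /var/lib/squidguard/db || exit 1
    for d in $(find . -mindepth 1 -type d | sed 's|^\./||' | sort); do
        echo "dest ${d//\//_} {"
        echo "    domainlist $d/domains"
        echo "    urllist    $d/urls"
        echo "}"
    done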

The minor problem with the script is that it generates incorrect lines for parent folders with subcategories, which raises errors at the SquidGuard level. But you'll need to run this script only once. Should you find a better way to get it done, please let me know.

One of the well-known lists, regularly updated and free to use for private purposes, is the Shalla list.

The automated list update process could look as below. Please note not to run it more often than every 24h, as per the request from the Shalla guys, since the list is not updated more often than that.
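A minimal sketch, assuming the Shalla download URL and Debian-style squidGuard paths:

    #!/bin/bash
    # fetch a fresh Shalla list, deploy it and rebuild the squidGuard db
    set -e
    cd /tmp
    wget -q http://www.shallalist.de/Downloads/shallalist.tar.gz
    tar -xzf shallalist.tar.gz                  # unpacks into BL/
    cp -r BL/* /var/lib/squidguard/db/
    squidGuard -C all                           # recompile the .db files
    chown -R proxy:proxy /var/lib/squidguard/db
    squid -k reconfigure                        # make Squid restart its helpers
    rm -rf /tmp/BL /tmp/shallalist.tar.gz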

The script can then be linked into the /etc/cron.daily folder:
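E.g. (script name assumed; note that run-parts skips file names containing dots):

    ln -s /usr/local/sbin/update-shalla /etc/cron.daily/update-shalla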

Links:
  1. http://terminal28.com/how-to-install-and-configure-squid-proxy-server-clamav-squidclamav-c-icap-server-debian-linux/
  2. https://calomel.org/squid_adservers.html
  3. http://www.kernel-panic.it/openbsd/proxy/proxy6.html
  4. https://help.ubuntu.com/community/SquidGuard
  5. https://www.cyberciti.biz/faq/squidguard-web-filter-block-websites/
  6. http://wiki.squid-cache.org/ConfigExamples/ContentAdaptation/C-ICAP
  7. http://dansguardian.org/
  8. http://thejimmahknows.com/network-adblocking-using-squid-squidguard-and-iptables/
  9. https://forum.pfsense.org/index.php?topic=72528.0
  10. https://github.com/diladele/docker-websafety
  11. http://www.squidguard.org/Doc/extended.html
  12. http://www.tecmint.com/configure-squidguard-for-squid-proxy/
  13. http://adzapper.sourceforge.net/

Thumbs on tunnels – terminate tunnel and check content – DLP

The past was easy peasy; that is not the case anymore these days. HTTPS was rarely used, and only where it really needed to be. Anyone on the wire could see what was going on. And yeah, that meant literally everyone.

Was it good? No, not at all. Everyone could track anyone, collect information, behavior, etc. The result? Everyone started to move towards HTTPS (SSL, then TLS). This has a couple of good and bad outcomes. Squid's cache ratio went down, and I mean very, very low these days. Most connections handled by Squid these days are tunnels (CONNECT), and anything can be sent through such a tunnel: not only private data, credit card numbers, etc., but also ads and viruses (sic!).

This created a specific need to be able to check the content of the stream, especially in enterprise environments. DLP systems can only work on streams if they can check the payload, i.e. decrypted traffic, which means they need to be the well-known man-in-the-middle: the tunnel endpoint from the client's side, initiating a new tunnel to the server. Only this allows the traffic to be inspected.

At the same time one could ask: hey, what about certificates? The long-winded answer can be found elsewhere, but the short one is that in an enterprise environment there is at least one Root Certificate Authority (CA), from which an Intermediate Certificate Authority can be created. The Root or Intermediate CA (called simply CA afterwards), together with its key file, is uploaded to the DLP system, allowing it to generate new certificates on the fly for terminated tunnels.

The CA is already trusted in the enterprise environment, as the Root CA certificate is added to the trusted CA ring on the client host as part of the operating system deployment package or domain joining.

Since the client trusts the Root CA, it automatically trusts certificates signed by the given CA; this is how Public Key Infrastructure works. Additionally, the DLP system usually tries to mimic all original certificate parameters, and only the CA-related details differ. People rarely check the details of a certificate if everything is green and no popup/error is raised.

private DLP

With all the above being said, one can have one's own DLP system based on Squid. This “little piece of software” is great at handling huge loads, caching data and calling others for help. This is what we’re going to do today.

To terminate SSL connections Squid uses the ssl_bump functionality. The Ubuntu 16.04 LTS default package is not built with this great functionality, hence we need to start with a little preparation.

Let’s get the sources and all needed libraries (for everything below, any sudo call is skipped, as it just makes the output longer and you, dear reader, certainly know how to use sudo).
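Along these lines (on Ubuntu 16.04 the source package is squid3; deb-src lines must be enabled in sources.list):

    apt-get install devscripts build-essential fakeroot libssl-dev
    apt-get source squid3
    apt-get build-dep squid3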

Then we need to apply a patch to enable SSL as needed.
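The patch boils down to adding the SSL flags to the configure options in debian/rules; a hand-made sketch (the surrounding context lines will differ per package version):

    --- a/debian/rules
    +++ b/debian/rules
    @@ (inside the configure options block) @@
    +		--enable-ssl \
    +		--enable-ssl-crtd \
    +		--with-openssl \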

To apply the patch, use your standard methodology: patch -p<level> < diff-file.patch

Afterwards, the Squid packages need to be built and installed, together with any missing dependencies.
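E.g. (exact package file names may differ):

    cd squid3-*                    # directory created by apt-get source
    dpkg-buildpackage -rfakeroot -b
    cd ..
    dpkg -i squid*.deb             # install the freshly built packages
    apt-get install -f             # pull in anything still missing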

Certificate generation

First of all, we need to have a folder where we will store it all, and to generate the CA certificate itself:
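A minimal sketch following the Squid wiki approach (file names, key size and validity to taste):

    mkdir -p /etc/squid/ssl_cert
    cd /etc/squid/ssl_cert
    # self-signed Root CA: certificate and private key in a single PEM file
    openssl req -new -newkey rsa:2048 -sha256 -days 3650 -nodes -x509 \
        -extensions v3_ca -keyout myCA.pem -out myCA.pem
    # DER copy for importing on client systems
    openssl x509 -in myCA.pem -outform DER -out myCA.der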

 

The two certificate files serve different purposes: the PEM one (certificate plus private key) stays on the proxy and is referenced from squid.conf, whilst the DER copy is the one handed out to clients.

Depending on the client system, the certificate import will look different. A good idea might be to place the certificate on some easily accessible server, e.g. the local wpad system.

Once the certificate is downloaded, it should be installed. On Windows this can be done with:
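From an elevated prompt:

    certutil -addstore -f "Root" myCA.der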

Some apps do not respect operating-system-level certificates, but most will. Some apps might need to be restarted; it shouldn't be required to restart the full system, but who knows what type of ancient system one might be using?

All the new certificates generated on the fly need to be stored somewhere. That folder needs to be created and initialized:
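On Ubuntu the helper lives under /usr/lib/squid (an assumption worth verifying on your build):

    /usr/lib/squid/ssl_crtd -c -s /var/lib/ssl_db   # creates and initializes the db
    chown -R proxy:proxy /var/lib/ssl_db            # squid runs as user "proxy"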

And update squid.conf (not all SSL-certificate-error-related flags should be set as below; tune them to your needs).
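A sketch of the relevant part, assuming the CA and cert-db paths from above; the ACL names match the files described below:

    http_port 3127 ssl-bump cert=/etc/squid/ssl_cert/myCA.pem \
            generate-host-certificates=on dynamic_cert_mem_cache_size=4MB
    sslcrtd_program /usr/lib/squid/ssl_crtd -s /var/lib/ssl_db -M 4MB

    acl sslBumpnet      src              "/etc/squid/ssl_bump/sslBumpnet"
    acl sslnoBumpnet    src              "/etc/squid/ssl_bump/sslnoBumpnet"
    acl sslnoBumpnetdst dstdomain        "/etc/squid/ssl_bump/sslnoBumpnetdst"
    acl sslnoBumpSites  ssl::server_name "/etc/squid/ssl_bump/sslnoBumpSites"

    acl step1 at_step SslBump1
    ssl_bump peek step1
    ssl_bump splice sslnoBumpnet
    ssl_bump splice sslnoBumpnetdst
    ssl_bump splice sslnoBumpSites
    ssl_bump bump sslBumpnet

    # error handling - tune this instead of blanket-allowing in production
    sslproxy_cert_error allow all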

Create a set of files under /etc/squid/ssl_bump:

  1. sslBumpnet – subnets/hosts which we will bump; can be selective if needed
  2. sslnoBumpnet – subnets/hosts we won’t bump (see the logical construction above and tune it to your needs)
  3. sslnoBumpnetdst – domains/servers we won’t bump
  4. sslnoBumpSites – server names (SNI) we won’t bump

An example sslnoBumpnetdst, as this one might be tricky to set up at the beginning: some apps have built-in certificates and verify the connection against them, e.g. the Google Play store and a couple of others.

It was interesting to find out what banks are doing with data: in at least one case, for statistics reasons, a bank was sending out GET requests with bank account details, including the balance and transaction details, to an online statistics agency. This was against any banking standards and the bank should be prosecuted for it. Spotting it was only possible after terminating the tunnel, as otherwise the GET was not visible. As always, this can only be acceptable for tests and for your own private use at home, in an isolated lab.

The below list was fine-tuned based on tests and used to be longer:
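For illustration only; your list will differ:

    # destinations whose apps pin certificates (examples, not a recommendation)
    .googleapis.com
    android.clients.google.com
    .mybank.example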

The last point is to restart Squid:
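On Ubuntu 16.04:

    systemctl restart squid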

To test it, select port 3127 as the proxy. Once all is tested, move on to transparent interception (don't forget to have the CA trusted on the client side).
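Transparent interception needs an extra intercept port in squid.conf plus a NAT redirect; a rough sketch (interface and ports are assumptions):

    # squid.conf addition (illustrative):
    #   https_port 3129 intercept ssl-bump cert=/etc/squid/ssl_cert/myCA.pem generate-host-certificates=on
    iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80  -j REDIRECT --to-port 3128
    iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 443 -j REDIRECT --to-port 3129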

 

For non-Android systems, a proxy.pac/wpad.dat file can be created.
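A minimal sketch, assuming the proxy listens on 192.168.0.1:3127:

    function FindProxyForURL(url, host) {
        // keep local destinations direct, send the rest through the proxy
        if (isPlainHostName(host) || isInNet(host, "192.168.0.0", "255.255.255.0"))
            return "DIRECT";
        return "PROXY 192.168.0.1:3127; DIRECT";
    }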

To get that working, the wpad.yourdomain and wpad hostnames need to resolve to your web server, and the wpad.dat/proxy.pac file needs to be in its root folder.

The Android platform is prone to not processing this file (at least as of April 2017), and manual proxy settings in the Wi-Fi section need to be set.

This should all work… but hey, problems are expected.

  1. The usual problem is that the CA is not trusted.
  2. The client/app has its own CA list and does not trust the operating-system-level list. This requires adding the CA to that app's trusted ring.
  3. Perl/Python based apps might have their own local SSL and root cert trusted ring, see: Kodi and ssl_bump Squid.
  4. For troubleshooting, full logging is often useful, see Log all request details on Squid.

 

Links and reads

  1. Diladele non-free:
    https://docs.diladele.com/administrator_guide_4_0/installation_and_removal/install_on_ubuntu.html
    https://www.diladele.com/solution.html
  2. ClamAV & SquidClamAV
    http://terminal28.com/how-to-install-and-configure-squid-proxy-server-clamav-squidclamav-c-icap-server-debian-linux/
    http://squidclamav.darold.net/install.html
  3. http://wiki.squid-cache.org/ConfigExamples/Intercept/SslBumpExplicit
  4. http://wiki.squid-cache.org/Features/SslPeekAndSplice
  5. http://www.squid-cache.org/Doc/config/ssl_bump/
  6. http://wiki.squid-cache.org/Features/DynamicSslCert
  7. http://ubuntuserverguide.com/2013/12/how-to-filter-https-traffic-with-squid-3-on-ubuntu-server-13-10.html
  8. https://forums.kali.org/showthread.php?23036-SSL-Interception-with-Squid3-(MITM)
  9. http://marek.helion.pl/install/squid.html
  10. http://thejimmahknows.com/squid-3-1-caching-proxy-with-ssl/
  11. http://www.squid-cache.org/Doc/config/acl/
  12. https://docs.diladele.com/administrator_guide_4_0/system_configuration/https_filtering/recompile_squid.html
  13. https://smoothnet.org/squid-v3-5-proxy-with-ssl-bump/

404 URL not found – permalinks and WordPress

When setting up WordPress, one can run into very silly issues.

Not Found

The requested URL /test was not found on this server.


Apache/2.4.7 (Ubuntu) Server at blob.mypn.eu Port 443

The usual suspects are:

  • .htaccess – verified,
  • mod_rewrite/rewrite – verified too

Yet despite a number of reviewed online recommendations, nothing worked until a silly mistake was discovered in the web server site definition:

Bad:
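An illustrative reconstruction (paths assumed):

    <VirtualHost *:443>
        DocumentRoot /var/www/wordpress
        <Directory "/">
            AllowOverride All
        </Directory>
    </VirtualHost>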

Good:
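The same assumed paths, with the Directory block corrected:

    <VirtualHost *:443>
        DocumentRoot /var/www/wordpress
        <Directory "/var/www/wordpress">
            Options FollowSymLinks
            AllowOverride All
            Require all granted
        </Directory>
    </VirtualHost>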

The most important part is the <Directory …> block: it needs to point to the directory where your WordPress is installed, and it does not go well with “/” set instead.