Normally, the content filter is the preferred place to whitelist sites: it is better at distinguishing groups and can apply separate policies to each. Sometimes, however, you also need to whitelist at the proxy server itself. Why?
In environments that use proxy authentication, whitelisting sites at the proxy server can be required because some client applications are unaware of the authentication requirement. A user's normal web browsing may work correctly while an application (e.g. Dropbox) never even attempts to authenticate. This shows up in the squid log files as a TCP_DENIED request with no associated user, coming from an IP address that typically does have an associated user for web browsing. (See Live Monitoring of Web Traffic in Proxy and Content Filter for details on how to view the logs and make this determination.)
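As a starting point, the failing requests can be pulled out of the access log directly. This is a minimal sketch that assumes the default log location and squid's native access.log format (client IP in field 3, URL in field 7, user in field 8); adjust both to match your installation.

```shell
# Assumed default log path -- change to match your access_log directive.
LOG=${LOG:-/var/log/squid/access.log}

# List denied requests that carry no authenticated user ("-"),
# counted per client IP and destination, most frequent first.
grep TCP_DENIED "$LOG" 2>/dev/null \
  | awk '$8 == "-" { print $3, $7 }' \
  | sort | uniq -c | sort -rn | head
```

Destinations that appear here for an IP address that otherwise browses with an authenticated user are candidates for the whitelist described below.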
From your logs you will be able to determine which sites are failing to authenticate for specific users and IP addresses. You can then add the domains manually to the Access Control Lists (ACLs) of your squid configuration.
You will need to modify your squid.conf file. At the top of the 'acl' entries, add the following (or similar) line listing your known-safe sites, as determined from your logs:
acl whitelist dstdomain .example.com .dropbox.com .mybadlywrittenapp.com
Next, allow HTTP and HTTPS (CONNECT) traffic matching this 'whitelist' ACL to pass through. Add the following to /etc/squid/squid_http_access.conf:
http_access allow whitelist
http_access allow CONNECT whitelist
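Putting the pieces together, the relevant configuration would look something like this (the domains are examples; the file paths follow the layout described above):

```
# In /etc/squid/squid.conf, near the other 'acl' lines:
acl whitelist dstdomain .example.com .dropbox.com .mybadlywrittenapp.com

# In /etc/squid/squid_http_access.conf:
# http_access rules are evaluated in order, so these allow lines
# must appear before any deny rule that would otherwise match.
http_access allow whitelist
http_access allow CONNECT whitelist
```

After editing, you can check the syntax with `squid -k parse` and apply the change without a restart via `squid -k reconfigure`.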