Ghostery lists Adobe TypeKit as privacy threat

The Internet tracker-blocking program Ghostery now lists Adobe TypeKit (a very popular font service) as a privacy threat. I first read about this on WUWT:

I’ve gotten a few complaints this week from some overly paranoid people that say they can’t see WUWT anymore in Firefox, but can in Safari. The problem seems to be related solely to a browser extension called “ghostery” which is somehow flagging Adobe Typekit (used to provide custom fonts on WordPress) as some sort of malware.

Ghostery is not malware-blocking software (as you can read on Wikipedia). It is software that protects you against tracking while surfing the web, and IMHO you are not overly paranoid when you use it. In the comments somebody explains:

Fonts are very seductive tracking beacons. Honest people who would never consider installing a tracking beacon have no qualms about using served fonts, and there’s no difference between them. There is a lot of ignorance out there regarding data mining.

So maybe Ghostery is not listing Adobe TypeKit by accident? We see with Google Analytics that website owners are happy to pay for analytics with their visitors’ privacy. The same may apply to fonts (although TypeKit is not free). But before we accuse Adobe, let’s take a look at the Adobe TypeKit privacy policy:

In order to provide the Typekit service, Adobe may collect information about the fonts being served to your website. The information is used for the purposes of billing and compliance, and may include the following: …

So, one thing is for sure: Adobe TypeKit is in fact collecting data while serving fonts. This alone may be reason enough for Ghostery to block it. I did some research and verified that, alongside the font files, TypeKit loads a 1×1 pixel GIF image with a URL like this:

http://p.typekit.net/p.gif?s=1&k=sgt5tia&app=&ht=tk&h=wattsupwiththat.com&f=...

In the privacy statement Adobe says they collect data “for the purposes of billing and compliance”, which seems reasonable. The privacy policy also lists the data that they collect, and none of it seems to invade the privacy of the website visitor. So is this a big fuss about nothing? I’m not sure. If you pay close attention to the wording, you see that they chose to write “may include”. AFAIK “may include” does not imply “is limited to”. Also, this “compliance” is not further specified. What do they need to comply with?

Can Adobe TypeKit be trusted to respect our visitors’ privacy? It probably can, but even after reading their privacy policy I’m not 100% sure. What do you think? Should I take off my tin-foil hat?

Working in the cloud to prevent viruses & trojans

This post touches on some of the IT security topics that modern companies may have to deal with.

Endpoint security? Problematic!

Endpoint security is the security of your company’s laptops and desktop computers. The security of these machines at the outer perimeter of the network is a hot topic. You can see the problem with home users, who do not have the security devices and software that companies have. Viruses that encrypt personal documents and demand a ransom to release them are common. Banking trojans are widespread, as there is much money to be made. Company databases containing millions of user credentials also get stolen. Even PC manufacturers turn malicious under pressure from advertisers: they ship new laptops with self-signed root certificates that undermine the web’s security model.

BYOD policy? Unstoppable!

Today Bring-Your-Own-Device (BYOD) policies are more popular than ever, as people bring their private smartphones to work. They identify with the device and the brand of the phone. Even the color of the phone or the installed software is part of their identity. People also want to use USB sticks, USB drives and their tablets at work, as these have become part of their IT vocabulary. Remote working is encouraged, and devices are carried from work to home and vice versa. As a result, laptops get connected to malicious networks, get stolen, or simply get lost. Fingerprint scanners, full-disk encryption and hardware tokens may help a bit, but they do not solve all problems.

PC or Mac? Yes, indeed!

Apple laptops (and phones) are very expensive and have become important status symbols in the workplace. Some colleagues may be lucky enough to get a shiny Apple laptop or phone from the boss. Others are not that privileged and try to fake their success by buying one with their own money. For phones this is fully accepted; for laptops, more and more companies are starting to allow it. Companies see fewer interoperability problems, because all major business applications have become browser-based, so the importance of the choice of desktop operating system is rapidly diminishing.

Laptops without viruses

When Google launched its Chromebook concept in 2011, I expected companies to start buying these for their employees. This laptop can safely be stolen or destroyed and is (by design) not vulnerable to viruses and trojans. It is even resilient against data loss due to forgotten backups. Its secret? The laptop does not store any data on its internal hard disk, but stores everything in the cloud. You can simply reset the laptop to factory defaults whenever it misbehaves, without losing any data. Google has also started offering complementary corporate email and calendaring solutions. I really thought they had a winner on their hands. I was wrong. Companies did not convert en masse.

Super fast and secure development workstations in the cloud!

At LeaseWeb we had (and still have) VMs to do development on, but these are not set up (or fast enough) to run graphical development tools or virtualization tools like Vagrant or Docker. I identified this problem (in 2012) and started an experiment with working fully in the cloud.

I started offering a multi-user desktop development environment for a small group of five developers on a single server. The dual-CPU server with 64 GB of RAM was operated by the team’s system engineer. The advantages were great: you could work from any machine without having to install your development environment, and connect from work or home to the same desktop and pick up where you left off. You could also easily share files on the local disks, and backups were made for you on the corporate backup systems. The environment was graphical and totally over-dimensioned, and thus super fast.

It failed (for that team). The multi-user desktop environment addressed most of the existing complaints, but developers now felt that they had less freedom (and less privacy). Apparently they did not care that the source code never left the company, nor about any of the other security advantages of working in the cloud (viruses, trojans and backups).

Fast forward to today. Many developers run Linux (often with encrypted disks) on their fast i5 laptops with 8GB of RAM. They put all their work in JIRA and Git, which are both in the cloud. So I guess that there is not much to gain anymore by moving development to the cloud.

But can’t anyone work in the cloud?

Could this pattern of working in the cloud also be applied to a company’s non-development departments? These departments may have access to more sensitive (financial) information, and their employees may have less IT knowledge. This may make viruses and trojans a higher risk.

You could set up some (Windows) terminal servers with the Remote Desktop Protocol (RDP) and work on these machines. You could run software updates during the night, make backups for the users, and lock the system down to prevent viruses and trojans. Employees could use the local browser (on their Chromebooks) for general Internet usage and a locked-down remote browser for the company web applications. This way the sensitive corporate data should stay protected.

What do you think? Would it work? Use the comments!

Meet the LeaseWeb Development Team

[Image: Working at LeaseWeb video thumbnail]

What is it like to work at LeaseWeb? In this video, our developers share their experiences. Click the image to see the video.

Would you like to join them? There’s always room for more talent. Whether you’re a datacenter guru or a programming prodigy, there’s always a place for you at LeaseWeb. Take a look at our vacancies and find out where you fit in!

About LeaseWeb:

LeaseWeb – part of the OCOM Group – is one of the world’s largest hosting brands offering a broad portfolio of public cloud servers, private clouds, bare metal servers, colocation and CDN services, all backed by a low-latency, blended global network with over 5.0 Tbps of capacity. Currently, LeaseWeb owns and operates approximately 65,000 servers which are hosted in our data centers across Asia, Europe and the U.S.

For more information visit:
http://www.leaseweb.com

MySQL-CRUD-API now has transforms!

Last week I created a new GitHub project called “MySQL-CRUD-API”. It allows you to quickly set up a simple REST API with CRUD functionality by adding a single “api.php” file to your project and configuring its database connection. Today I will show how the relational support of this project works.

Supported table relations

There are three types of table relations supported:

  • BelongsTo
  • HasMany
  • HasAndBelongsToMany

Blog example

The “list” command of the API allows you to specify multiple tables. If these tables have relations (foreign keys), the output is filtered so that only relevant records are returned. This is what the API outputs when you list posts and comments filtered on a specific post:

{
    "posts": {
        "columns": [
            "id",
            "user_id",
            "category_id",
            "content"
        ],
        "records": [
            [
                "1",
                "1",
                "1",
                "blog started"
            ]
        ]
    },
    "comments": {
        "relations": {
            "post_id": "posts.id"
        },
        "columns": [
            "id",
            "post_id",
            "message"
        ],
        "records": [
            [
                "1",
                "1",
                "great"
            ],
            [
                "2",
                "1",
                "fantastic"
            ]
        ]
    }
}

Not so useful, right? You would probably prefer to see something like this:

{
    "posts": [
        {
            "id": "1",
            "comments": [
                {
                    "id": "1",
                    "post_id": "1",
                    "message": "great"
                },
                {
                    "id": "2",
                    "post_id": "1",
                    "message": "fantastic"
                }
            ],
            "user_id": "1",
            "category_id": "1",
            "content": "blog started"
        }
    ]
}

That is exactly what the function “mysql_crud_api_transform()” does. You run this function on the client after receiving the API response. This is beneficial as it uses the CPU and RAM of the API consumer instead of those of the API server. The transformation function is implemented in both PHP and JavaScript, so you can make spiders and users with browsers equally happy!

<?php
function mysql_crud_api_transform(&$tables) {
	$getobjs = function(&$tables,$table_name,$where_index=false,$match_value=false) use (&$getobjs) {
		$objects = array();
		foreach($tables[$table_name]['records'] as $record) {
			if ($where_index===false || $record[$where_index]==$match_value) {
				$object = array();
				foreach ($tables[$table_name]['columns'] as $index=>$column) {
					$object[$column] = $record[$index];
					foreach ($tables as $relation=>$reltable) {
						foreach ($reltable['relations'] as $key=>$target) {
							if ($target == "$table_name.$column") {
								$columnidx = array_flip($reltable['columns']);
								$object[$relation] = $getobjs($tables,$relation,$columnidx[$key],$record[$index]);
							}
						}
					}
				}
				$objects[] = $object;
			}
		}
		return $objects;
	};
	$tree = array();
	foreach ($tables as $name=>$table) {
		if (!isset($table['relations'])) {
			$tree[$name] = $getobjs($tables,$name);
		}
	}
	return $tree;
}

And the JavaScript version:

function mysql_crud_api_transform(tables) {
	var array_flip = function (trans) {
		var key, tmp_ar = {};
		for (key in trans) {
			tmp_ar[trans[key]] = key;
		}
		return tmp_ar;
	};
	var get_objects = function (tables,table_name,where_index,match_value) {
		var objects = [];
		for (var record in tables[table_name]['records']) {
			record = tables[table_name]['records'][record];
			if (!where_index || record[where_index]==match_value) {
				var object = {};
				for (var index in tables[table_name]['columns']) {
					var column = tables[table_name]['columns'][index];
					object[column] = record[index];
					for (var relation in tables) {
						var reltable = tables[relation];
						for (var key in reltable['relations']) {
							var target = reltable['relations'][key];
							if (target == table_name+'.'+column) {
								var column_indices = array_flip(reltable['columns']);
								object[relation] = get_objects(tables,relation,column_indices[key],record[index]);
							}
						}
					}
				}
				objects.push(object);
			}
		}
		return objects;
	};
	var tree = {};
	for (var name in tables) {
		var table = tables[name];
		if (!table['relations']) {
			tree[name] = get_objects(tables,name);
		}
	}
	return tree;
}
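
For illustration, here is a minimal usage sketch of the PHP version. It simply decodes the condensed “posts”/“comments” response shown earlier and passes it to mysql_crud_api_transform(); in a real application the JSON would of course come from an HTTP request to the API (the exact request URL depends on your setup and is not shown here).

<?php
// Minimal usage sketch: assumes mysql_crud_api_transform() from above is loaded.
// The JSON below is the condensed response from the blog example; in practice
// you would receive it from the API over HTTP instead of hard-coding it.
$response = '{
	"posts": {
		"columns": ["id","user_id","category_id","content"],
		"records": [["1","1","1","blog started"]]
	},
	"comments": {
		"relations": {"post_id": "posts.id"},
		"columns": ["id","post_id","message"],
		"records": [["1","1","great"],["2","1","fantastic"]]
	}
}';

$tables = json_decode($response, true);     // decode into nested PHP arrays
$tree = mysql_crud_api_transform($tables);  // nest the comments under their posts
echo json_encode($tree, JSON_PRETTY_PRINT); // prints the nested structure shown above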

Check out all the source code on my GitHub account: https://github.com/mevdschee/mysql-crud-api

Simple PHP REST API script for MySQL

With single-page applications (or SPAs) becoming very popular very quickly, we see a rising need to add APIs for everything. Most companies take a first step by creating a simple data-driven API. I wrote a PHP script that generates a simple and fast REST API from your MySQL tables with full CRUD support. Even pagination and filtering are supported! It is only 450 lines of code, not exactly rocket science, but it may be useful when you need to whip up a Minimum Viable Product (or MVP).

Limitations

  • Authentication and authorization are not included
  • Validation on API input is not included
  • Only a single database is supported

Features

  • Single PHP file, easy to deploy
  • Very little code, easy to adapt and maintain
  • Streaming data, low memory footprint
  • Condensed JSON: first row contains field names
  • Blacklist support for tables (and columns, todo)
  • JSONP support for cross-domain requests
  • Combined requests with support for multiple table names
  • Pagination, sorting and search support
  • Relation detection and filtering on foreign keys
  • Relation “transforms” for PHP and JavaScript

Configuration

This is a single-file application. At the bottom of the file you will find the configuration:

$api = new MySQL_CRUD_API(
	"localhost",                        // hostname
	"user",                             // username
	"pass",                             // password
	"db",                               // database
	false,                              // whitelist
	array("users"=>"crudl")             // blacklist
);
$api->executeCommand();
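
For illustration, here is a hedged sketch of how a client could call the generated API from PHP. The “list” URL format (api.php followed by a table name) is the one used in this post; the create, update and delete calls below assume the conventional REST mapping of POST, PUT and DELETE onto api.php/{table}/{id}, so check the project README for the exact URL scheme of your version.

<?php
// Sketch of typical client calls (the POST/PUT/DELETE mapping is an assumption,
// see the project README for the authoritative URL scheme).
$base = 'http://localhost/api.php';

// list all categories (GET)
$categories = json_decode(file_get_contents("$base/categories"), true);

// create a category (POST) -- assumed mapping
$context = stream_context_create(array('http' => array(
	'method'  => 'POST',
	'header'  => 'Content-Type: application/json',
	'content' => json_encode(array('name' => 'Security')),
)));
$newId = file_get_contents("$base/categories", false, $context);

// update category 3 (PUT) -- assumed mapping
$context = stream_context_create(array('http' => array(
	'method'  => 'PUT',
	'header'  => 'Content-Type: application/json',
	'content' => json_encode(array('name' => 'Web dev')),
)));
file_get_contents("$base/categories/3", false, $context);

// delete category 3 (DELETE) -- assumed mapping
$context = stream_context_create(array('http' => array('method' => 'DELETE')));
file_get_contents("$base/categories/3", false, $context);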

Example output

When you request the URL http://localhost/api.php/cate* you will be matching a single table (“categories”) in the configured database. The (formatted) output would be something like this:

{
    "categories": {
        "columns": [
            "id",
            "name"
        ],
        "records": [
            [
                "1",
                "Internet"
            ],
            [
                "3",
                "Web development"
            ]
        ]
    }
}

As you can see, the column names appear only once at the start of the object, and the table name is used as a key on the object, which allows multiple tables to be matched when you use a wildcard (star) in the URL.
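
Because the column names appear only once, a client that wants conventional row objects has to combine them with each record itself. Below is a minimal sketch of that in PHP, using the “categories” output above (the variable names are just for illustration):

<?php
// Expand the condensed "columns" + "records" format into one associative
// array per row; $response holds the JSON output shown above.
$response = '{"categories":{"columns":["id","name"],"records":[["1","Internet"],["3","Web development"]]}}';

$data = json_decode($response, true);
$rows = array();
foreach ($data['categories']['records'] as $record) {
	// pair each value with its column name
	$rows[] = array_combine($data['categories']['columns'], $record);
}
print_r($rows);
// [ ['id' => '1', 'name' => 'Internet'], ['id' => '3', 'name' => 'Web development'] ]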

Get it on Github!

If you want to get it, check out my GitHub page for this little project:

https://github.com/mevdschee/mysql-crud-api

Contributions, forks and additions are more than welcome.