Tuesday, February 13, 2018

How to Create a Kid-safe YouTube Playlist

Now that I've got the Beta of Tunnel Video live, I'd like some feedback, so I'm getting the word out about some of the best use cases for this free application. While it's true that Tunnel Video is the simple alternative to video editing, it's also true that there's more to the app than creating free highlight reel playlists from YouTube videos.


Two use cases that are perfect for Tunnel Video are creating replay highlights for sports teams and building kid-friendly video playlists for parents. In this post, we'll dive into the second one...

As a stay-at-home dad, I'm well aware of one of the modern parenting difficulties: limiting screen time and controlling what your kids are watching. Tunnel Video highlight reels solve many issues:

  • since you only add the videos you want, you always know what your kids are watching
  • you can set the start and end time of each clip, so you also know exactly which part of each video your children are seeing
  • all of the content is streaming from YouTube, but since you're not on the YouTube site (or app), there are no comments and no recommended videos
Even the YouTube Kids app does not address these issues because suggested videos are easily accessible to little fingers. It drives my wife crazy when our kids are watching a video and they start tapping around to open up other videos. Even if you're sitting next to them, they can do it too quickly for you to stop them, and then they want to watch those other videos they discovered. In our house, it's the unboxing videos that always used to come up--that is, before we created Tunnel Video playlists.

Try the free Tunnel Video app. I recommend registering (also free) so you can manage multiple highlight reels under your account.


Tuesday, January 30, 2018

AvePoint Takes Fire for Unethical Marketing

Having worked in the Microsoft SharePoint space for many years, I know that it's unusual for SharePoint Independent Software Vendors (ISVs) to get much mainstream press coverage, so I was surprised to see a Washington Business Journal article about an AvePoint marketing campaign. The article, "All's fair in sales and marketing? Competitor blogs that Metalogix is for sale, tries to poach customers," highlights a shady marketing tactic recently employed by AvePoint against my former employer, Metalogix.

(Update: The Metalogix response, "Metalogix is Forever" has been posted now.)

Others have written about this dishonourable marketing move (for example: An Open Letter to AvePoint | An Unethical Marketing Campaign), and I can understand their frustration. The SharePoint partner community used to be a pretty tight-knit group focused on "coopetition." Sure, we were competing, but we also worked together to raise the profile of SharePoint--which BTW was a great strategy that paid off for everyone. As they say, 'a rising tide lifts all boats.'

But that's just not the case anymore. As money poured in and investors took notice, the atmosphere changed. A win-at-all-costs mentality emerged from some companies, and the whole situation became a lot less interesting for many of us who started in content management before the SharePoint era. I suspect this marketing ploy will backfire.

Sunday, December 31, 2017

Tunnel Video - Free Highlight Reel Creator for YouTube is Live!


I'm thrilled to return to blogging after a crazy few months working on my web app for YouTube videos: Tunnel Video. The Beta is live, so head on over and create free highlight reels from YouTube videos.

Tunnel Video is the simple alternative to video editing. Whether you're looking to highlight the best parts of a tutorial, put together the best moments of various videos (e.g., sports), or just create a kid-friendly playlist without comments, ads, or recommended videos--it's easy, free, and doesn't require any install. 

Once you've created your highlight reel or mashup of YouTube videos, you can quickly share the results with a public view URL. Send it via social media, email, or whatever--it's just plain easy.

Check it out! I'd appreciate any feedback.

My GitHub heat map for 2017 tells the story pretty clearly--I started the project in April.

Cawood Tunnel Video GitHub heat map

This new project is under the umbrella of my software startup: Second Marker.

Tuesday, November 14, 2017

How to View Files in a Google App Engine Docker Container

This post has been adapted (abridged) from the Google Cloud Platform (GCP) Debugging an Instance documentation. I wanted to see the file structure that was being deployed for my Angular 5 application--my YouTube video highlight web app (Tunnel Video). These are the steps required.

1. Go to your App Engine instances in the GCP console


2. Find the instance and choose the SSH option to open a terminal in a browser window

3. Once the terminal window for the instance opens, list the docker containers: $ sudo docker ps

4. Find the docker container in the list with your project name in it and then run: $ container_exec containerId /bin/bash

5. This will open a shell in your container. From there, just list your files as usual: ls app
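
Putting steps 3 through 5 together, the terminal session looks something like this (the container ID is a placeholder; if the container_exec helper isn't available on your instance, plain docker exec does the same job):

$ sudo docker ps
$ container_exec <container-id> /bin/bash
$ ls app

The docker exec equivalent of step 4 would be: $ sudo docker exec -it <container-id> /bin/bash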

Wednesday, October 18, 2017

Postman with Authenticated Google API HTTP Requests

Let's say I want to send an HTTP GET request to a Google API using the super helpful Postman application. That's simple enough, I'll just choose GET and enter the URL. In this example, I'll search YouTube for videos with "test" in their details.

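For reference, the full request URL looks something like this (YOUR_API_KEY is a placeholder for the key discussed below):

https://www.googleapis.com/youtube/v3/search?part=snippet&q=test&type=video&key=YOUR_API_KEY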

Breaking down this URL, you'll see that the query ("q") is "test", I'm asking for only the 'snippet' details of each video to be returned ("part=snippet"), and I'm restricting the results to videos ("type=video"). Without the optional type parameter, the search would also return channels.

The only cumbersome thing about this GET request is that I need a Google API key. These keys allow Google to limit how many API requests any account is making--and potentially charge the user when the count gets too high. To get a key, you need a Google Cloud Platform (GCP) account; then follow the instructions to create an API key from the API Manager section.

Since API keys are--by definition--limited, most people try to keep them private and restrict who can use them. There are a few different ways to add API Key Restrictions in GCP. "HTTP referrers (websites)" is a popular and straightforward option. For example, if you were using the key from a local website, you could add http://localhost:<port>/* to the list (using whatever port your local site runs on).

Note: If you don't care about your key being used by others, then you can leave the key with no restrictions and skip the next part.

In Postman, you can pretend you're sending the request from a local website by adding a "Referer" header entry. However, since it's a restricted header, there is an extra step. You must turn on Postman Interceptor by clicking the Interceptor button at the top of the window (next to the sync button). If you have Interceptor off, the Referer header entry will be ignored and you'll get an error: "Error 403: The request did not specify any referer. Please ensure that the client is sending referer or use the API Console to remove the referer restrictions."
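
For clarity, the entry itself is just a standard HTTP header. For example, if you whitelisted a local referrer (the port here is hypothetical), the header entry would be:

Referer: http://localhost:4200/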

Now you're sending the GET request with an API key and you're getting back JSON results that look like this:

    {
        "kind": "youtube#searchListResponse",
        "etag": "\"m2yskBQFythfE4irbTIeOgYYfBU/71Y1Pa_Vox_0ZzzjdbBNppwdf0s\"",
        "nextPageToken": "CAUQAA",
        "regionCode": "CA",
        "pageInfo": {
            "totalResults": 1000000,
            "resultsPerPage": 5
        },
        "items": [
            {
                "kind": "youtube#searchResult",
                "etag": "\"m2yskBQFythfE4irbTIeOgYYfBU/WePlVVP0Z4fWK6zl92pA9jVLbdQ\"",
                "id": {
                    "kind": "youtube#video",
                    ...


For more about the Google API options for YouTube Search, refer to the YouTube developer documentation.

Just that query alone is useful, but there's still one key thing missing from our request--authentication. I'm not going to go into detail about OAuth authentication here--you can read about that elsewhere--so this section will just follow the basic steps. To use the Google APIs as an authenticated user, you need an OAuth token.

There are a few ways to get a Google OAuth token. Probably the simplest option is to use Google's own OAuth playground. This is a super useful app that allows you to fiddle with all sorts of settings.

Another option is to use Postman's built-in Get New Access Token feature. To use it, click on Authorization (next to the Headers tab) and then the "Get New Access Token" button. Here you would enter the Auth URL as https://accounts.google.com/o/oauth2/auth and the Access Token URL as https://accounts.google.com/o/oauth2/token. You also need to know the Scope for your request--which you'll probably want to grab from Google's OAuth Playground anyway.

Once you have an access token, a simple method to check its validity is to paste this URL into a browser address bar with the token appended to the end: https://www.googleapis.com/oauth2/v1/tokeninfo?access_token=

Note: Checking the token will show its expiration time and also its scopes.
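
If you'd rather stay on the command line, the same check works with curl (YOUR_ACCESS_TOKEN is a placeholder):

$ curl "https://www.googleapis.com/oauth2/v1/tokeninfo?access_token=YOUR_ACCESS_TOKEN"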

Tuesday, September 19, 2017

Google Cloud Platform for a Full-stack Angular Web Application

At first blush, getting a complete Angular web application deployed and running on the Google Cloud Platform (GCP) can be a daunting task. Each quickstart tutorial is reasonable enough, but if you read ahead through all of the various documentation pages you'll need, they can soon begin to snowball.

UPDATE: After I started writing this post, I discovered a Google lab that covers the same topic: Build a Node.js & Angular Web App Using Google Cloud Platform.

I'll just quickly provide an overview of deploying Angular as the front-end app (on Google App Engine) and MySQL with a Node.js API on the backend. MySQL will run on Cloud SQL and Node.js runs on App Engine.

The basic steps involved are:

1. Set up a GCP account and project

2. Download the Angular sample and install the requirements

3. Deploy your Angular front-end app to GCP

4. Set up a MySQL DB on Cloud SQL. I wrote a whole post about setting up MySQL on Google Cloud SQL.

5. Complete the tutorial for using Google Cloud with Node.js

Since these samples aren't linked, you'll have to test the pieces separately until you develop some interaction in your Angular app. You can use Postman to test your Node.js API.
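
To give a flavour of step 3, the deploy itself comes down to one gcloud command run from the folder containing your app.yaml. This minimal config is only a sketch based on the App Engine flexible environment Node.js samples--your runtime settings may differ:

$ cat app.yaml
runtime: nodejs
env: flex
$ gcloud app deploy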

Monday, August 14, 2017

Set up MySQL on Google Cloud Platform

Here's a walkthrough of how to set up MySQL on Google Cloud Platform's (GCP) Cloud SQL service.

1. Create a Google account if you don't have one: https://accounts.google.com

2. Sign up for a free trial on: https://console.cloud.google.com

3. Create a GCP project to house your MySQL instance.

4. Create a Cloud SQL instance--obviously, choosing MySQL as the DB type.

5. Create a database in the instance by going to the Databases tab under the Instance. For this post, I'll create one called "test".

6. To add a schema to your DB, you can use the Cloud SQL command line, https://dbdesigner.net or MySQL Workbench. Of course, if you know SQL, you can just type out your schema, but if you want a visual representation of your DB, these tools are handy.

I like the simplicity and no-install aspect of dbdesigner.net, but I ran into a bug, so I don't use it as much as I would otherwise. Hopefully, they'll fix it soon. Until then, I guess I have to recommend MySQL Workbench since it's great in many ways.

7. If you just want to quickly test that your DB is working, you can run some simple queries in the Google Cloud Shell. To connect to your MySQL instance, open the shell (the little command line icon in the top-right of the GCP dashboard) and type in this command (the username is usually root):

googleusername@projectId:~$ gcloud beta sql connect dbinstance --user=dbusername

You'll be prompted for your password and then you'll see the MySQL prompt. Note that it might take a few seconds before it appears.

Whitelisting your IP for incoming connection for 5 minutes...
Enter password:

Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 1234
Server version: 5.7.14-google-log (Google)

Copyright (c) 2000, 2016, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

8. To run your test queries, you'll need to select your DB within your instance.

mysql> use test
Database changed

Once you've selected the DB, you can run a simple operation to create a table and then use SHOW TABLES; to see the results.

The command I'll use to create a simple table is:

CREATE TABLE `testtable` (
  `id` INT NOT NULL,
  PRIMARY KEY (`id`)
);

This creates a new table with one column called "id".

And this is what it looks like when I paste this command into the Cloud Shell:

mysql> CREATE TABLE `testtable` (
    ->   `id` INT NOT NULL,
    ->   PRIMARY KEY (`id`)
    -> );
Query OK, 0 rows affected (0.03 sec)

mysql> show tables;
+----------------+
| Tables_in_test |
+----------------+
| testtable      |
+----------------+
1 row in set (0.00 sec)

9. Now that you have a working DB, you'll want to connect to it from code and possibly from a MySQL client (e.g., the 'dolphin' a.k.a. MySQL Workbench).

To get the info you need, go to the Instance Details page in GCP and look at the Properties under the Overview tab. What you need is the "IPv4 address". Grab that, but you won't be able to connect to your instance yet. First, you have to allow connections.

To allow your client to connect to your DB in the Google cloud, you need to open Authorized Networks and add your network IP. To get your client IP, type "what's my IP" into Google.
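
Once your IP is authorized, a quick way to confirm the connection from a terminal (assuming you have the mysql client installed, and substituting the IPv4 address you grabbed above) is:

$ mysql --host=xxx.xxx.xxx.xxx --user=root --password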

10. Now you can access your DB from a MySQL client, but what about a local dev instance of a GCP App Engine application? For that, you need the Google Cloud SQL Proxy running.

NOTE: The default setting is for all apps within the same project to be authorized; if you're using the DB from a different GCP project, you'll have to authorize it explicitly.

Without the Cloud SQL Proxy running, your local dev instance of your app will return the error: Error: connect ECONNREFUSED xxx.0.0.xxx:3306.

You'll need to install the Google Cloud SDK (configuration will have you select the project; if you have multiple environments--e.g., test, dev, prod--you may have to change this at some point with cawood$ gcloud config set project projectName) and enable the Cloud SQL API, if you haven't already. You'll also need a credential file to supply to the proxy. To get this file, create a service account in the IAM & Admin section of GCP. Make sure you grant the account rights to Cloud SQL. When you create the account, you'll be able to create a new private key and download a JSON credential file. The Cloud SQL Proxy will read this credential file.


11. Once you have everything ready, here is the command to start the Cloud Proxy:
./cloud_sql_proxy -instances=instanceConnectionName=tcp:3306 \
    -credential_file=credentials.json &

If all goes well, the Cloud Proxy command will return this output:

cawood$ 2017/04/23 22:34:39 Listening on 127.0.0.1:3306 for instanceConnectionName
2017/04/23 22:34:39 Ready for new connections

FYI: From the command line help:
  * On Google Compute Engine, the default service account is used.
    The Cloud SQL API must be enabled for the VM.

  * When gcloud is installed on the local machine, the "active account" is used
    for authentication. Run 'gcloud auth list' to see which accounts are
    installed on your local machine and 'gcloud config list account' to view
    the active account.

  * To configure the proxy using a service account, pass the -credential_file
    flag or set the GOOGLE_APPLICATION_CREDENTIALS environment variable. This
    will override gcloud or GCE credentials (if they exist).
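
With the proxy listening, a local client connects to 127.0.0.1:3306 as if the DB were on your machine--a sketch, assuming the root user:

$ mysql --host=127.0.0.1 --port=3306 --user=root --password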

Tuesday, July 11, 2017

Second Marker Software

My next project has begun. I’m working on a software startup called Second Marker.


Our first solution—and the inspiration for starting the company—is a free and easy alternative to video editing: Tunnel Video. If you want to make a highlight reel of YouTube videos, this will be the simplest solution.

There are endless possibilities for why someone would want to create and share a playlist of YouTube videos with start and end times; two obvious use cases are sports replay highlights and kid-friendly playlists for parents.

Update: try the Beta for free at http://www.tunnelvideo.com!

Wednesday, June 21, 2017

Google Cloud SQL Proxy on Same Machine as MySQL

As a web developer, it makes sense that you'd want to connect to your Google Cloud SQL instances running in Google Cloud Platform (GCP) via the Cloud SQL Proxy and also have MySQL running locally on your machine. Unfortunately, the default configurations for these two systems will cause an error when you try to run the proxy: "bind: address already in use."

One clear reason to run the proxy is that it allows you to securely connect to your Cloud SQL instances with MySQL Workbench. You can get the IP (hostname) and port from GCP no problem, but to connect, you need to be running Google's proxy. (You also need to create a JSON credential file from GCP and add your IP to the Authorized Networks BTW.)

The reason for the error is that both the proxy and the local default install of MySQL will try to communicate on port 3306. When you try to start the proxy, this is the result:

doom:project cawood$ ./cloud_sql_proxy -instances=instancename-9999:us-west1:instance=tcp:3306 -credential_file=CredFileName-dflkjal.json &

doom:project cawood$ using credential file for authentication; email=test@test.999999.iam.gserviceaccount.com
2017/06/05 23:51:24 listen tcp 127.0.0.1:3306: bind: address already in use

To resolve this error, one solution is to create a config file for the local MySQL server and change the port.

One thing to note (as per the MySQL documentation) is that the installer for macOS will not create a config file: "Note As of MySQL 5.7.18, my-default.cnf is no longer included in or installed by distribution packages." If you don't have a config file already, you can create one with these commands:

$ cd /etc
$ sudo nano my.cnf

This is all you need in the file:

[mysqld]
port = 3366

Once you're done, use CTRL+O to save and CTRL+X to exit Nano.

It's worth noting that I used to see this error before I installed MySQL locally. The reason was that another Cloud SQL Proxy process ("cloud_sql_proxy") was already running. The solution to that is simply to kill the process.
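
Finding and killing the stray process is quick (the PID is whatever ps reports):

$ ps aux | grep cloud_sql_proxy
$ kill <PID>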

Wednesday, April 12, 2017

Node.js Authentication Error Connecting to Google Cloud SQL Proxy

I was receiving this error trying to run a local instance of a Node.js API against a Google Cloud Platform (GCP) database.

Error: ER_ACCESS_DENIED_ERROR: Access denied for user ''@'cloudsqlproxy~' (using password: NO)
    at Handshake.Sequence._packetToError (/Users/cawood/GitHub/project/node_modules/mysql/lib/protocol/sequences/Sequence.js:52:14)
    at Handshake.ErrorPacket (/Users/cawood/GitHub/project/node_modules/mysql/lib/protocol/sequences/Handshake.js:103:18)
    at emitOne (events.js:96:13)
    at Socket.emit (events.js:191:7)
    at readableAddChunk (_stream_readable.js:178:18)
    at Socket.Readable.push (_stream_readable.js:136:10)

The solution was quite straightforward, but when I first googled the error, I couldn't find anything about the access denied error showing no user--the username should be in the error (i.e., 'username'@'cloudsqlproxy~').

I thought the error was coming from the GCP Cloud SQL Proxy, but I was initiating it correctly and with a valid credential file for a service account:

$ ./cloud_sql_proxy -instances=name-111111:us-central1:instancename=tcp:0000 \
    -credential_file=serviceAccountCreds.json &

The problem was actually with the environment variables for my local instance of the Node.js API: I hadn't exported them. Of course, the error was correct--I wasn't trying to connect with any user at all.

$ export MYSQL_USER="username" 
$ export MYSQL_PASSWORD="password" 
$ export MYSQL_DATABASE="test" 
$ npm start

To check if you have set these correctly, you can output them to the console when you start the server: console.log(config);

cawood$ npm start 
> projectapi@0.0.1 start /Users/cawood/GitHub/project
> node server.js 
 { user: 'username', password: 'password', database: 'test' } 
App listening on port 8000
Press Ctrl+C to quit.
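
You can also spot-check an export straight from the shell before starting the server:

$ echo $MYSQL_USER
username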

Friday, March 10, 2017

Google Acquires AppBridge Software

At the Google Cloud Next conference yesterday it was announced that Google has acquired AppBridge. I've been at AppBridge for just over a year and I'm thrilled to be a part of this Vancouver software success story.

AppBridge is a tremendous addition to Google's solution lineup because the founders designed the AppBridge Transformation Suite from the ground up for performance and the cloud. AppBridge is already handling the world's largest G Suite and Google Drive migrations.

Here is a quote from Google's announcement:

"Migrating to the cloud can be complex. It's not just your files that need to be moved; permissions also need to map correctly; content likely needs to be reorganized, and some data probably needs to be archived. To address that challenge, today we are announcing the acquisition of AppBridge, an enterprise-grade, G Suite migration tool that helps organizations seamlessly migrate from their on-prem, cloud-based and hybrid solutions to Google Drive.

With AppBridge, your organization can migrate files effortlessly to G Suite from your existing file servers or content management systems like SharePoint, or from many other cloud platforms you might be using. File permissions are also brought over when you migrate, which means your team's file access remains unchanged and your data stays safe. We’re working together with AppBridge to bring them into the G Suite team. Stay tuned for more information in the near future."

As for me, I'm going to take some time to think about what I want for my next adventure, but this one was certainly memorable.

TechCrunch Article: Google acquires AppBridge to help enterprises move their files to its cloud services

Wednesday, February 15, 2017

Google Sheets Split into Columns

I'm very much appreciating Google Sheets' split into columns feature. Ya, I know, it's not like it hasn't been done before, but man it's nice to have when you need it.

To split text into columns using separator options (including space, comma, custom, etc.):
  1. Open a spreadsheet in Google Sheets.
  2. Paste the data you want to split into columns.
  3. In the bottom right corner of your data, click the Paste icon.
  4. Click Split text to columns. Your data will split into different columns.
  5. To change the delimiter, in the separator box, click Comma to open the dropdown of options.

Wednesday, January 18, 2017

Building Video Games with My Daughter and Lego

I asked my (then) 4-year-old daughter if she wanted to make a video game, and her response was exactly what I expected: "We can't make a game." "Actually," I said, "we can! Want to do it?" She emphatically said yes, so we set out to make an iOS game. Of course, I had already done some playing around with Unity 3D, so I knew that a 2D game wouldn't be too much work.

I wanted her to have the best experience possible, so I devised a plan to take her ideas and convert them into the game without changing much along the way. I used a sample 2D game as a starting point so we wouldn't have to spend too much time writing code before she saw her creations in a working game. Note that if you don't deploy your app to the store, you can do all of this for free (plus the cost of Lego of course).

We started by drawing out the main elements of the game by hand. Here are the villain and hero characters as my daughter drew them.

After that, it was off to the Lego store so we could get enough 2x2 pieces to build whatever we wanted. Here's the villain Lego version my daughter created from her original drawing.

We designed the rest of the game elements including a tree and an egg, and then I used photos of the Lego concepts to convert her creations to simple sprite versions. Here's another character before reducing the pixels down to a simple sprite.

My daughter building Lego versions of the game sprites.

This is a screenshot from the first build I deployed to my iPhone (just for fun, I threw in a photo of my daughter as the player). Not bad for a 4-year-old with some Lego!

We had a great time on this project, and I was prompted to write this post when she recently asked if we could play the game again.

Wednesday, December 28, 2016

How GitHub Shows My Second Child Arrived

I happened to notice this funny visual representation of the 2015 birth of my son on my GitHub profile. When anyone asks me questions about having kids, I'll have to show them this.

I'm pasting in a screenshot of my profile page below. The heat map is pretty clear about my son arriving in September. There's one outlier green square after his arrival--which I assume was me just committing every change I had made before his birth.

Tuesday, November 01, 2016

BIA Woman Indiegogo Campaign is Live!

Two of my friends have set out to create the perfect pants for female athletes. Today they launched their Indiegogo campaign for BIA Woman's perfect pants. Check it out and help support female athletes everywhere!


"We took it to other athletes – our clients, training partners, friends, and competitors. We listened to the problems they were struggling with and then asked ourselves the most important question of all: How can we make a difference?
BIA WOMAN ATHLETICS was born. We partnered with a team of ingenious fabric and design experts, recruited high-performance women athletes across multiple disciplines to test our products and came up with the best line of workout clothing designed by and for athletic women."

Saturday, October 15, 2016

My Blog's Most Popular Search Term: Microsoft Bob

This goes into the wacky Internet category. I just happened to glance at the "Search Keywords" stats for my blog in the Blogger admin console and noticed the single most popular search term (since I started the blog around 2005) that has led people to the geeklit blog is "Microsoft Bob."

What's possibly even more bizarre is that number two is "barcelona aquarium." Really? Out of all the pages out there, people are searching for an aquarium and thinking, "ya, geeklit sounds like the right site for that."

Very strange people...

Sunday, September 18, 2016

Why I Just Bought a MacBook Pro

I wrote a post in 2014 entitled, Why I Just Bought a Chromebook, so it may seem odd that I'm now writing this post. To be clear, I'm not backing off my favourable position on Chromebooks at all. They're great and they can be a lot more affordable than other laptops... like the MacBook Pro, for example. MacBooks are overpriced; I feel that's a pretty defensible comment. So why would I buy one?

The answer is actually quite simple and surprisingly non-technical. I have spent my entire career working in software, but I've never bought an Apple computer of any type. I've been using my wife's old MacBook for years, but I've never experienced the best that Apple has to offer and I always wondered what I was missing. Combine that with the current popularity of Macs--especially amongst web developers--and I just decided it was time. I've run many versions of Windows. I've used FreeBSD and some flavours of Linux (including Ubuntu on my Chromebook), but it's not until now that I can say that I've really had the full Mac experience.

Unfortunately, this version of the MacBook is missing some of the hardware niceties that I enjoyed in the older MacBook I used. For example, the MagSafe connector for the power cord is gone, there is no SD card reader, no light to show the battery is charging, and no external lights to show how much of the battery remains. Those were all useful. Most people have heard about the lack of ports on this new laptop--I've given a nod to that controversy by including a dongle in the image above. Almost everything requires a dongle, which is a pain. One notable exception is the headphone jack. Unlike the iPhone 7, the MacBook Pro actually does have a headphone jack, which is nice. I'll have to see how much of an issue this is (or isn't) for me. So far, I've only purchased one dongle--for USB devices. That will cover a lot for me. If I really have issues, I'll get one of the third-party adaptors that adds ports to the machine. (Update: I saw a pretty cool "HyperDrive" one that adds an SD card reader, USB port and HDMI port.)

Of course, there are lots of improvements as well. The screen is much better, the battery life is improved and everything just generally runs faster on the newer processor and RAM. I also really like the keyboard. It has a very satisfying click which reminds me of a mechanical keyboard.

I haven't had the machine very long, so this isn't a proper review. At this point, I'm feeling good about the purchase. Yes, I do miss some MacBook features, but the Pro machine is much faster than my old one and allows me to project to Apple TV. The other one was too old to offer that feature--which gives you an idea of just how big a step up this was in hardware.

Sunday, August 14, 2016

Angular 2 ng serve or ng build Permissions Error

Update: If you're running macOS, you should just use Homebrew from the start and you'll likely avoid these issues.

If you grab an Angular 2 project from the web, you might find that you run into permissions errors after you install it.

For example, these commands are typical of a simple project install. (This example uses yarn, but NPM would have similar results.) The addition of 'sudo' is common on Macs, but it can cause the permissions problem.

$ sudo npm install -g angular-cli
$ sudo npm install -g yarn
$ sudo yarn install

The problem occurs when you try to run $ ng serve (or $ ng build); you see an error such as this one:

EACCES: permission denied, open '/Users/cawood/GitHub/test/node_modules/arr-flatten/index.js'
Error: EACCES: permission denied, open '/Users/cawood/GitHub/test/node_modules/arr-flatten/index.js'
    at Error (native)
    at Object.fs.openSync (fs.js:640:18)
    at Object.fs.readFileSync (fs.js:508:33)
    at Object.Module._extensions..js (module.js:578:20)
    at Module.load (module.js:487:32)
    at tryModuleLoad (module.js:446:12)
    at Function.Module._load (module.js:438:3)
    at Module.require (module.js:497:17)
    at require (internal/module.js:20:19)
    at Object.<anonymous> (/Users/cawood/GitHub/mix/node_modules/arr-diff/index.js:10:15)

The error occurs because the install ran as the admin user. The quick solution is to give the current user sufficient rights to the folder structure in your project. This example is heavy-handed, but it's fine for prototypes. These commands change the permissions recursively for everything (files and folders) in the test directory which is the one that contains the Angular 2 project.

$ sudo chmod -R +rwx test 
$ cd test 
$ ng serve
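
A less heavy-handed alternative--an assumption on my part, not something I tested for this post--is to take ownership of the project folder instead of opening up its permissions:

$ sudo chown -R $(whoami) test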

Tuesday, July 05, 2016

Switching to Google Cloud Platform Storage

After staring down an upcoming bill for website hosting, I quickly decided to switch my website content (including most of the images on this blog) to Google Cloud Platform (GCP) storage buckets. GCP storage buckets for static websites are much cheaper, and I liked the idea of having all my content managed by Google's cloud services.

To set up static content in GCP storage buckets, follow the steps outlined in the Google Cloud article, Hosting a Static Website.

To serve content from your custom domain, you'll need FTP access to upload a file to your current hosting storage. This is how Google verifies that you own the domain.

Once you've verified your domain, you create a storage bucket named the same as your domain (e.g., www.mycustomdomainname.com) and then upload your files to the storage bucket. There are two ways to upload content: you can use the dead-easy Storage Browser web interface, or you can use gsutil, a Python app that enables command-line access to GCP.
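
If you go the gsutil route, the upload itself is a single command (the folder and bucket names here are placeholders):

$ gsutil -m cp -r ./website gs://bucketname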

Even if you use the web UI to upload your files, you'll probably still want to use gsutil for some additional actions. For example, to set files as public, you have to go into every folder using the UI (it's laborious to say the least), but gsutil provides a simple command to accomplish the same thing extremely quickly. This one command recursively sets every file in the bucket to public read: gsutil -m acl set -R -a public-read gs://bucketname

After you've got your files uploaded and set to public, you can test the links from the Storage Browser and then your bucket is ready to go. The last step is to go to your domain registrar's site and add a CNAME entry for GCP storage. For example, www.mydomainname.com points to c.storage.googleapis.com. After adding the alias, you can verify it's working by using the free CNAME lookup tool on http://mxtoolbox.com.
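
You can also verify the alias from a terminal with dig:

$ dig www.mydomainname.com CNAME +short
c.storage.googleapis.com.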

That's it! You're now saving money and your files are being managed by someone else. Very nice.

Note that if you're moving from Windows hosting to GCP, you might be switching from an IIS web server on Windows to a Linux backend. This means that your new Linux-driven URLs will be case sensitive. This actually broke some of my image links because I hadn't matched the case of the file paths exactly in the HTML--IIS doesn't care about such details, so the mismatches had gone unnoticed.

Update: I saved over $120 by not renewing my hosting plan, and my first Google Cloud Platform monthly bill was $0.08.