Tuesday, October 05, 2021

10 Things I Undervalued When I Started Playing Minecraft

I started playing Minecraft a few years ago because I thought it would be an interesting game for my creative daughter. That turned out to be true (I even blogged about it--Minecraft Bonding With My Daughter), but what I didn't anticipate was how much I would enjoy the game myself. Every Minecraft newbie runs into pitfalls; which ones actually trip you up varies from player to player. These are the things that initially slowed my progress.



10. Tridents

At face value, you might assume swords are the best offensive weapon in the game. However, anyone who has run into a drowned carrying a trident might tell you otherwise. Tridents are certainly harder to come by--you'll probably have to collect a few from drowned and combine them to repair one--but they do more damage. They can also be thrown as a ranged weapon, and the Loyalty enchantment makes a thrown trident return to you.

9. Ender Pearls

Once you have a decent weapon and armour, you can get ender pearls relatively easily. Yes, you have to fight Endermen to do it, but it's a fight you can win. Ender pearls are incredibly powerful because they let you almost instantly teleport away from where you are, which makes them one of the most effective defensive tools in the game. If you get into trouble, simply chuck a pearl and you'll be out of danger.

Note: you can't throw an Ender pearl out of lava. I tried that once... but only once. 😉 

8. Charcoal 

Most people agree the very first thing you should do when you start a new world is cut down a tree. This gives you wood for your first tools, and from there you can work up to stone tools, and then iron tools after that. However, the one use of wood that I underestimated is making charcoal. If you build a furnace early and cook logs in it, you get charcoal you can use to make torches. Once you have torches, instead of hiding from mobs all night, you can use that time to start digging a mine and speed up your ability to find iron, lava, diamonds, and plenty of other useful resources.

7. Trading to get experience points

It's surprisingly lucrative to trade with villagers. Yes, you get emeralds (or other items), but you also get experience points. If you breed villagers and put down the right work blocks for them, you can trade everything from rotten flesh to stone.

6. Spelunking

At the beginning of the game (when you don't have durable tools), this is by far the best option for gathering iron and coal. Strip mining might be more thorough, but exploring caves requires far less mining time and lets you quickly examine a lot of exposed blocks.

5. Lava as fuel

Look it up: a bucket of lava can cook/burn more than any other type of fuel--roughly 100 items per bucket. This allows you to save your coal for torches and trading with villagers.
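
As a back-of-the-envelope check, here's a quick sketch using approximate Java Edition burn times (each furnace operation takes about 10 seconds; treat the exact numbers as assumptions that can shift between game versions):

```python
# Approximate Java Edition burn times, in seconds (assumed values).
BURN_SECONDS = {"lava_bucket": 1000, "coal": 80, "blaze_rod": 120}
ITEM_SMELT_SECONDS = 10  # one furnace operation takes ~10 seconds

def items_smelted(fuel: str) -> int:
    """How many items a single unit of fuel can smelt."""
    return BURN_SECONDS[fuel] // ITEM_SMELT_SECONDS

print(items_smelted("lava_bucket"))  # 100 items per bucket
print(items_smelted("coal"))         # 8 items per piece of coal
```

That gap (100 vs. 8 items) is why it's worth keeping a bucket or two of lava near your furnaces.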

4. Enchanting and combining books

Books can even be combined to raise their enchantment levels. For example, you can combine two Fortune I books into one Fortune II book.
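
The anvil's combining rule can be sketched like this (the max level is per-enchantment--Fortune caps at III--and this ignores anvil XP costs):

```python
def combine_books(level_a: int, level_b: int, max_level: int) -> int:
    """Anvil combining rule (sketch): two books at the same level yield the
    next level (capped at the enchantment's maximum); otherwise you just
    keep the higher of the two levels."""
    if level_a == level_b:
        return min(level_a + 1, max_level)
    return max(level_a, level_b)

print(combine_books(1, 1, max_level=3))  # two Fortune I books -> Fortune II (2)
print(combine_books(3, 3, max_level=3))  # already at the cap -> still III
```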

3. Automatic sugarcane farms

Before I learned to build automatic sugarcane farms, levelling up villagers was a colossal pain in the chest plate. But now that I've tried them, I'm completely sold. I would never bother levelling up librarians without sugarcane farms again.

Note: use a tool like Chunkbase to make sure you don't build your redstone machines on chunk boundaries. If you build on a boundary, your machine will keep breaking--which is awful. Fortunately, it's easy to figure out where the boundaries are and avoid them.
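
Since chunks are 16x16 blocks, boundaries fall at multiples of 16, so you can also check a build's footprint yourself. A quick sketch (the coordinates are made-up examples):

```python
def chunk_index(coord: int) -> int:
    """Chunk containing a block coordinate; floor division handles negatives too."""
    return coord // 16

def crosses_chunk_boundary(start: int, end: int) -> bool:
    """True if the block span [start, end] straddles a chunk boundary."""
    return chunk_index(start) != chunk_index(end)

print(crosses_chunk_boundary(12, 19))  # True: crosses the boundary at x=16
print(crosses_chunk_boundary(16, 31))  # False: both ends inside one chunk
```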

2. Enchanting for durability

I assume this is the biggest "no-brainer" on this list, but I'd like to clarify exactly what I mean. Obviously, enchanting is great. What I didn't realize when I first started playing, though, was just how important it is to get an enchanting table as fast as possible. The difference between Unbreaking-enchanted pickaxes and vanilla ones can be the difference between constantly needing iron for tools and using that iron for something else. When I start a new world, I dig down to around level 11 and start looking for diamonds. As soon as I find iron, I can make a bucket and take water down with me. Then it's a simple matter of finding lava and using the water to create obsidian. Most likely, this will all happen before I find diamonds, but as soon as I do, I can make a diamond pickaxe (to mine the obsidian) and an enchanting table.

1. Fishing

A lot of people find fishing boring, but that wasn't why I initially undervalued it. At the beginning of a world, you get enormous value from fishing: experience points, food, saddles, enchanted books, enchanted fishing rods, and enchanted bows. The enchanted bows are probably the most important, since fishing can get you an Infinity bow without an enchanting table. When you don't have much, an Infinity bow is a game-changer.

Note: remember that fishing at night in the rain is the most efficient, so build a platform above the water where mobs can't reach you and fish at night.

That's my list. I hope you found it helpful.

Monday, April 05, 2021

Reel BJJ Public Beta is live!

I'm super excited to have opened up the web application version of Reel BJJ for public Beta feedback. Reel BJJ is a video management application for athletes and coaches who want to get the most out of their Brazilian jiu-jitsu videos. (Try it now--there's a free trial.)

Reel BJJ logo

Why create a jiu-jitsu video app? The answer is simple. There is an amazing library of jiu-jitsu content available from a number of sites such as BJJ Fanatics, GrappleArts, Grapplers Guide, BJJ Library, and Gracie University--the list goes on. In fact, in a recent interview with Bernardo Faria on BJJ Fanatics, Mikey Musumeci made exactly this point. The content available to people wanting to learn techniques is seemingly endless. However, the way video is used for learning jiu-jitsu hasn't changed--most people just watch an instructional through and hope they remember the best parts. The mission of Reel BJJ is to change the way people think about learning jiu-jitsu via video.



- Reel BJJ allows users to create Reels without video editing

In short, Reel BJJ can help everyone (athletes, instructors, coaches, or hobbyists) get the most value possible from video content. The first main feature of the Reel BJJ Beta release is called Reels (and before anyone gets upset with me, I created Reels a while back, so yes, it was before Instagram stole my thunder). Reels allow anyone to quickly (and easily) create highlights from their jiu-jitsu videos and then just watch the parts of the videos they want to review. (You can see how Reels work in this short walkthrough video of the Reel BJJ Beta.)

Some of the most common scenarios for Reels would include:

  • Techniques you'd like to learn
  • Techniques you'd like to try before class
  • Footage of your sparring rounds to review
  • Footage of your competition matches to review
  • Research on opponents for upcoming competitions
  • A game plan of techniques

The key, of course, is that Reel BJJ users can do this without any software install or video editing. Users can add videos from YouTube, Vimeo, or even upload their own, and then create highlights. Naturally, you'll want to add notes to your highlights, so Reel BJJ has a cool notes feature that just shipped.

Reel BJJ Beta Notes

It's important to note that Reel BJJ does not download videos from streaming services, or violate their terms of service in any way. The videos remain at the source and aren't altered in any way.

I'm excited to see what will happen. The Beta feedback so far has been fantastic. For example, here's some feedback from Thomas Lisboa.

"It's unbelievable how jiu-jitsu is constantly evolving. That's why I was so excited when I discovered Reel BJJ. I can select the parts I liked most in a course and review them. For those who like to study jiu-jitsu, regardless of their level and goals, Reel BJJ is the evolution of jiu-jitsu study." - Thomas Lisboa (Head Coach of Alliance BJJ Vancouver)


Wednesday, November 11, 2020

Group Policy Prevents Microsoft Apps Working on Windows 10

I recently ran into an odd problem on my home Windows 10 PC. Even though the machine is my home PC and isn't part of a domain, I started getting a Group Policy error any time I tried to use a program that requires a Microsoft account. This is the error:

"This program is blocked by group policy. For more information contact your system administrator. 0x800704ec"

This error can affect any app or program that uses your Microsoft account. This includes things like the Microsoft Store, the Xbox app, the Feedback Hub app, and even Minecraft.

Group Policy Windows 10 issue geeklit.com


This error happens when your Microsoft account has been linked to a work or school account that has a group policy preventing the app from running properly. To fix the issue, disassociate the problematic account from your home Microsoft account.

To remove the account, go to Settings > Accounts > Access Work or School. Find any reference to a work or school account associated with your account and choose disconnect. Remember, this is all assuming that you don't mind not having that account associated with your home account.

Tuesday, October 27, 2020

Cypress.io testing with Auth0 and Angular 10

I read the documentation on both the Auth0 site and the Cypress.io site about using the two technologies together with the Angular framework, but I simply couldn't get it to work.

For example, the tutorial on the Auth0 Blog called The Complete Guide to Angular User Authentication with Auth0 got me most of the way there and the same was true for the Cypress.io "real-world app" code and documentation on GitHub (which is written for React BTW).

These two resources (plus some community posts) allowed me to set up all the underpinnings of a working test system, but didn't resolve all my issues. In the past, my Angular project exclusively used Google Authentication, so my Cypress tests required me to manually log in first and then run all the tests that needed an authenticated user. It wasn't perfect, but it was workable. However, this manual login option did not work with Auth0 (and Angular 10): the login window would not open in an iframe while running in the context of a Cypress test (i.e., while Cypress was automating/controlling the browser).

- Auth0 authenticated tests running successfully in Cypress.io and Angular 10


This is what I ultimately did to get it working:

1. Switch my Node.js API from Google tokens to Auth0 authentication tokens by following the Auth0 quickstart for Node.js (Express) backends.

2. Set up a test SPA Application in Auth0 with lower security than my production app. This allowed me to enable username/password logins and turn on the Password Grant Types (under Application > Advanced Settings).

3. Follow the Cypress real-world app section called "Cypress Setup for Testing Auth0" to add a login command to my Cypress.io setup.

Under Cypress > support > commands.ts, I now have the code below (you can use plain JavaScript too, of course). The configuration values come from src > cypress.env.json.

/// <reference types="cypress" />
/// <reference types="jwt-decode" />

import jwt_decode from 'jwt-decode';

Cypress.Commands.add('login', (overrides = {}) => {
  const username = Cypress.env('auth0_username');
  cy.log(`Logging in as ${username}`);

  cy.request({
    method: 'POST',
    url: Cypress.env('auth0_url'),
    body: {
      grant_type: 'password',
      username: Cypress.env('auth0_username'),
      password: Cypress.env('auth0_password'),
      audience: Cypress.env('auth0_audience'),
      scope: Cypress.env('auth0_scope'),
      client_id: Cypress.env('auth0_client_id'),
      client_secret: Cypress.env('auth0_client_secret'),
    },
  }).then(({ body }) => {
    const claims: any = jwt_decode(body.id_token);
    const { nickname, name, picture, updated_at, email, email_verified, sub, exp } = claims;

    const item = {
      body: {
        ...body,
        decodedToken: {
          claims,
          user: {
            nickname,
            name,
            picture,
            updated_at,
            email,
            email_verified,
            sub,
          },
          audience: '',
          client_id: '',
        },
      },
      expiresAt: exp,
    };

    window.localStorage.setItem('auth0Cypress', JSON.stringify(item));
    return body;
  });
});


let LOCAL_STORAGE_MEMORY = {};

Cypress.Commands.add('saveLocalStorageCache', () => {
  Object.keys(localStorage).forEach(key => {
    LOCAL_STORAGE_MEMORY[key] = localStorage[key];
  });
});

Cypress.Commands.add('restoreLocalStorageCache', () => {
  Object.keys(LOCAL_STORAGE_MEMORY).forEach(key => {
    localStorage.setItem(key, LOCAL_STORAGE_MEMORY[key]);
  });
});


4. With this command added to Cypress.io, I can now log in to Auth0 programmatically using the following code in my first test. You'll note that the authenticated user details are written to local storage (but only during testing), and the Auth0 access token is stored as a cookie--this is the token I use with my backend API.


describe('Login', () => {
  beforeEach(() => {
    cy.restoreLocalStorageCache();
  });

  it('Should successfully login', () => {
    cy.login()
      .then((resp) => {
        return resp;
      })
      .then((body) => {
        const { access_token, expires_in, id_token } = body;
        const auth0State = {
          nonce: '',
          state: 'some-random-state'
        };

        // write access token to user-token cookie
        cy.setCookie('user-token', access_token);

        const callbackUrl = `/callback#access_token=${access_token}&scope=openid&id_token=${id_token}&expires_in=${expires_in}&token_type=Bearer&state=${auth0State.state}`;
        cy.visit(callbackUrl, {
          onBeforeLoad(win) {
            win.document.cookie = 'com.auth0.auth.some-random-state=' + JSON.stringify(auth0State);
          }
        });
      });
  });

  afterEach(() => {
    cy.saveLocalStorageCache();
  });
});


5. This all seemed to work great; however, Auth0 still would not recognize that the user was authenticated--the Auth0 client always returned false for isAuthenticated. To get around this issue, I had to hack my AuthGuard.

This is what I had before adding Cypress.io:

return this.auth.isAuthenticated$.pipe(
  tap(loggedIn => {
    if (!loggedIn) {
      this.auth.login(state.url);
    }
  })
);


To get around the issue, I simply added another option: if the code is being run by Cypress, I check for the stored user credentials and token. I even added a check that it's the right authenticated user, but that's not really needed.


// @ts-ignore
if (window.Cypress) {
  const auth0credentials = JSON.parse(localStorage.getItem("auth0Cypress")!);
  const user = auth0credentials.body.decodedToken.user;
  const access_token = auth0credentials.body.access_token;

  if (user.name === 'youtestuser@yourdomain.com' && access_token) {
    return true;
  } else {
    return this.auth.isAuthenticated$.pipe(
      tap(loggedIn => {
        if (!loggedIn) {
          this.auth.login(state.url);
        }
      })
    );
  }
} else {
  return this.auth.isAuthenticated$.pipe(
    tap(loggedIn => {
      if (!loggedIn) {
        this.auth.login(state.url);
      }
    })
  );
}


Well, I think that's everything. I hope a better answer is coming, as this AuthGuard hack is not a perfect solution, but it does let me move forward and that's promising.


Wednesday, May 27, 2020

Debugging TypeScript Phaser Apps Server and Client-side Using VS Code

I’m using the Phaser game development platform to teach my kids some video game design and programming concepts (I previously used LEGO to draw my daughter into building game sprites). As a new Phaser user, I'm finding there isn't enough writing out there about using the Phaser 3 HTML5 game development platform with TypeScript, so I feel I should add my $0.02. My first contribution is about debugging with breakpoints in Phaser and Visual Studio Code (VSCode).

Note: You can read more about setting up VSCode for TypeScript debugging on the Microsoft site, but one tip is to remember you'll need to make sure you have "sourceMap": true in your tsconfig.json file.

If you're using Phaser for a multiplayer game, you're probably using Node.js and Express to fire up a server for the Phaser client-side code. I found this complicated debugging with breakpoints in VS Code: I haven't been able to get a single configuration to hit breakpoints in both the server- and client-side code. This makes sense, but I still thought it would be handled more gracefully.

If I launch a debugger for the server, any breakpoints I've set in my client-side code show "Breakpoint set but not yet bound" or the more direct error, "Breakpoint ignored because generated code not found (source map problem?)" My workaround is to launch the debugger depending on which breakpoints I need.

Server-side debugging

To debug the server, I use the VSCode "Attach to Process" configuration and choose the dist/server.js process (server.js is the transpiled version of my server.ts file that starts the Node.js server).

VS Code server-side breakpoint Geeklit Blog


Client-side debugging

To debug the client (this uses the Visual Studio Code Chrome debugger extension), I launch the debugger with the "Launch client-side" task.

Here is the entire launch.json file that supports both client and server-side debugging with breakpoints: 

{
    // Use IntelliSense to learn about possible attributes.
    // Hover to view descriptions of existing attributes.
    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Launch client-side: localhost:8080",
            "type": "chrome",
            "request": "launch",
            "url": "http://localhost:8080/index.html",
            "webRoot": "${workspaceFolder}/dist/client"
        },
        {
            "name": "Attach to Process dist/server.js",
            "type": "node",
            "request": "attach",
            "processId": "${command:PickProcess}",
            "port": 8080
        }
    ]
}

Tuesday, April 21, 2020

Running NPM in Windows Terminal Ubuntu profile

I got some nasty errors when I tried to use NPM and Node.js in the new Windows Terminal for Windows 10 (which, BTW, has a cool split-pane feature that might just make it my new go-to terminal).

The errors looked like this:

 not foundram Files/nodejs/npm: 3: /mnt/c/Program Files/nodejs/npm:
: not foundram Files/nodejs/npm: 5: /mnt/c/Program Files/nodejs/npm:
/mnt/c/Program Files/nodejs/npm: 6: /mnt/c/Program Files/nodejs/npm: Syntax error: word unexpected (expecting "in")

Here's the fix

$ sudo apt-get update
$ sudo apt-get install -y nodejs
$ sudo apt-get install build-essential
$ sudo apt-get install npm
**CLOSE AND REOPEN TERMINAL**

I found it here: https://github.com/microsoft/WSL/issues/1512

BTW--example split-pane keys: Alt+Shift+- splits horizontally and Alt+Shift+= splits vertically.

Sunday, June 16, 2019

Minecraft Bonding With My Daughter


I hadn't played much Minecraft since a co-worker showed me an original beta version, but I knew enough about the best-selling PC game to suggest that my creative daughter give it a try. I thought it would be a great outlet for her crafty passion and also a nice way to get more exposure to computers. At first, we tried Creative mode and, having just watched The Swiss Family Robinson, we jumped into building a treehouse. We spent a reasonable amount of time working on that world, and then one (or both?) of us suggested we try Survival mode.

Survival mode is a real-time strategy game and is therefore highly addictive. Before we knew it, we were using any time we could to build up our world and it got quite intense. The first time my daughter fell into lava, she was distraught because she had lost everything she was carrying: tools, weapons, supplies, etc. I started to regret ever leaving Creative mode, but things calmed down and we're now at a reasonable place where it's just one of the things we do and it's not such a big deal if someone goes swimming in lava.

Here's a screenshot of the first survival house we built that wasn't just a shelter from monsters.

Minecraft house cawood blog geeklit.com


The best part is that she got interested in Redstone engineering. For anyone who doesn't know, Redstone is the Minecraft way to create circuits that can power machines, traps, and other contraptions. My wife suggested a lighthouse would be cool, so we built a functioning one together using Redstone. My daughter even has her own Creative mode test world where she tries out Redstone inventions before using them in her other worlds. It's super cool.

I highly recommend this sort of cooperative play with your kids--just watch those screen time limits. We've thoroughly enjoyed the experience and my daughter has even started doing a Minecraft after school program.

Sunday, April 08, 2018

Deploying Python app to Google Cloud Platform (GCP) fails for numpy==1.9.3

I'm not sure what caused this issue. I've had this GCP Python app running for many months, but today the deploy command failed with this error.


$ gcloud app deploy ~/appFolder/app.yaml ~/appFolder/cron.yaml --verbosity=error

Step #1:   Could not find a version that satisfies the requirement numpy==1.9.3 (from versions: 1.14.5, 1.14.6, 1.15.0rc2, 1.15.0, 1.15.1, 1.15.2, 1.15.3, 1.15.4, 1.16.0rc1, 1.16.0rc2, 1.16.0, 1.16.1)
Step #1: No matching distribution found for numpy==1.9.3
Step #1: You are using pip version 10.0.1, however version 19.0.1 is available.
Step #1: You should consider upgrading via the 'pip install --upgrade pip' command.
Step #1: The command '/bin/sh -c pip install -r requirements.txt' returned a non-zero code: 1
Finished Step #1
ERROR
ERROR: build step 1 "gcr.io/cloud-builders/docker@sha256:1f0bba269252a1b2cde650368ddc970afe80207986a88b8a0011e94b8xxxxxx" failed: exit status 1
Step #1: 
-------------------------------------------------------------------------------------------------------------


ERROR: (gcloud.app.deploy) Cloud build failed. 

Even though the error message complains about numpy, the culprit was the pinned pandas version, which drags in the old numpy. To resolve the issue, I changed this line in requirements.txt from:

pandas==0.22.0

to:

pandas; python_version >= '3.5'
pandas<0.21; python_version == '3.4'

Then I ran:


$ pip install -r requirements.txt

I found this solution in a comment on this GitHub issue: https://github.com/pandas-dev/pandas/issues/20697

Saturday, March 31, 2018

Using PS3 wireless controllers with SUMOSYS

I got some inexpensive unofficial PS3 wireless controllers for my SUMOSYS retro game console and the standard setup instructions didn't work. Here's how I got them working with SUMOSYS 700.

1. Connect one working default controller and one new wireless controller (with the USB cable). (In my case, the wireless controller was constantly vibrating--don't worry about this.)
2. Go to the menu, choose controller setup, and map the buttons. The controller should now work with the cable attached (but possibly still vibrating).
3. Detach the cable from the new controller and go to the controller menu again with the controller flashing for a Bluetooth connection. I didn't have to do anything else here--it just found it.
4. The wireless controller was now working, so I unplugged the old wired controller, went to the controller menu, and assigned the new controller to P1 (player 1 in games).
5. Use the menu to shut down. This saves the controller settings.
Repeat the process for the other controllers; you can skip connecting the wired controller since you now have a working wireless controller to navigate the menus.

I tried 1942 and Mario Kart and they work great so far. Big improvement!

Note: to exit games, I use Start + Hotkey (which I set to the Home button)

Tuesday, February 13, 2018

How to Create a Kid-safe YouTube Playlist

Now that I've got the Beta of Hive Video live, I'd like some feedback, so I'm getting the word out about some of the best use cases for this free application. While it's true that Hive Video is the simple alternative to video editing, it's also true that there's more to the app than creating free highlight reel playlists from YouTube videos.

free kid-safe video playlists on tunnel video

Two use cases that are perfect for Hive Video are easily creating replay highlights for sports teams and kid-friendly video playlists for parents. In this post, we'll dive into the second one...

As a stay-at-home dad, I'm well aware of one of the modern parenting difficulties: limiting screen time and controlling what your kids are watching. Hive Video highlight reels solve many issues:

  • since you only add the videos you want, you always know what your kids are watching
  • you can set the start and end time of each clip, so you also know exactly which part of each video your children are seeing
  • all of the content is streaming from YouTube, but since you're not on the YouTube site (or app), there are no comments and no recommended videos
Even the YouTube Kids app does not address these issues because suggested videos are easily accessible to little fingers. It drives my wife crazy when our kids are watching a video and they start tapping around to open up other videos. Even if you're sitting next to them, they can do it too quickly for you to stop them, and then they want to watch those other videos they discovered. In our house, it's the unboxing videos that always used to come up--that is before we created Hive Video playlists.

Try the free Hive Video app and I recommend you register (also free) so you can manage multiple highlight reels under your account.

Tuesday, January 30, 2018

AvePoint Takes Fire for Unethical Marketing

Having worked in the Microsoft SharePoint space for many years, I know that it's unusual for SharePoint Independent Software Vendors (ISVs) to get much mainstream press coverage, so I was surprised to see a Washington Business article about an AvePoint marketing campaign. The article, All's fair in sales and marketing? Competitor blogs that Metalogix is for sale, tries to poach customers, highlights a shady marketing tactic recently employed by AvePoint against my former employer, Metalogix.

(Update: The Metalogix response, "Metalogix is Forever" has been posted now.)


Others have written about this dishonourable marketing move (for example: An Open Letter to AvePoint | An Unethical Marketing Campaign), and I can understand their frustration. The SharePoint partner community used to be a pretty tight-knit group focused on "coopetition." Sure, we were competing, but we also worked together to raise the profile of SharePoint--which BTW was a great strategy that paid off for everyone. As they say, 'a rising tide raises all boats.'

But that's just not the case anymore. As money poured into the space, investors took notice and the atmosphere changed. A win-at-all-costs mentality emerged at some companies, and the whole scene became a lot less interesting for many of us who started in content management before the SharePoint era. I suspect this marketing ploy will backfire.




Sunday, December 31, 2017

Hive Video - Free Highlight Reel Creator for YouTube is Live!



I'm thrilled to return to blogging after a crazy few months working on my web app for YouTube videos: Hive Video. The Beta is live so head on over and you can create free highlight reels from YouTube videos.

Hive Video is the simple alternative to video editing. Whether you're looking to highlight the best parts of a tutorial, put together the best moments of various videos (e.g., sports), or just create a kid-friendly playlist without comments, ads, or recommended videos--it's easy, free, and doesn't require any install. 

Once you've created your highlight reel or mashup of YouTube videos, you can quickly share the results with a public view URL. Send it via social media, email, or whatever, it's just plain easy.

Check it out! I'd appreciate any feedback.

My GitHub heat map for 2017 tells the story pretty clearly--I started the project in April.

Cawood Tunnel Video GitHub heat map

This new project is under the umbrella of my software startup: Second Marker.

Update: we now support Twitch VOD videos as well.


Tuesday, November 14, 2017

How to View Files in a Google App Engine Docker Container

This post has been adapted (abridged) from the Google Cloud Platform (GCP) Debugging an Instance documentation. I wanted to see the file structure that was being deployed to my Angular 5 application for my YouTube video highlight web app (Tunnel Video). These are the steps required.



1. Go to your App Engine instances in the GCP console

https://console.cloud.google.com/appengine/instances

2. Find the instance and choose the SSH option to open a terminal in a browser window

3. Once the terminal window for the instance opens, list the docker containers: $ sudo docker ps

4. Find the docker container in the list with your project name in it and then run: $ container_exec containerId /bin/bash

5. This will open a shell in your container. From there, just list your files as usual: ls app


Wednesday, October 18, 2017

Postman with Authenticated Google API HTTP Requests

Let's say I want to send an HTTP GET request to a Google API using the super helpful Postman application. That's simple enough, I'll just choose GET and enter the URL. In this example, I'll search YouTube for videos with "test" in their details.

https://www.googleapis.com/youtube/v3/search?q=test&key=AIzaSyDbnfRUVxqMEJqYwxxxxx-uV75_GaPTVr0&part=snippet&type=video

Breaking down this URL, you'll see the query ("q=test"), a request for only the 'snippet' details of each result ("part=snippet"), and a restriction to videos ("type=video"). Without the optional type parameter, the search would also return channels.
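
If you'd rather compose the request programmatically, here's a Python sketch that builds the same URL (the key is a placeholder--substitute your own):

```python
from urllib.parse import urlencode

params = {
    "q": "test",            # search query
    "part": "snippet",      # return only the snippet details
    "type": "video",        # restrict results to videos (no channels)
    "key": "YOUR_API_KEY",  # placeholder: create one in the GCP API Manager
}
url = "https://www.googleapis.com/youtube/v3/search?" + urlencode(params)
print(url)
# Fetch it with, e.g., urllib.request.urlopen(url) once a real key is in place.
```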



The only cumbersome thing about this GET request is that I need a Google API key. These keys allow Google to limit how many API requests any account is making--and potentially charge the user when the count gets too high. To get a key, you need a Google Cloud Platform (GCP) account; then follow the instructions to create an API key from the API Manager section.

Since API keys are--by definition--limited, most people try to keep them private and restrict who can use them. There are a few different ways to add API Key Restrictions in GCP. "HTTP referrers (websites)" is a popular and straightforward option. For example, if you were using the key from a local website, you could add http://localhost:/* to the list.

Note: If you don't care about your key being used by others, then you can leave the key with no restrictions and skip the next part.

In Postman, you can pretend you're sending the request from a local website by adding a "Referer" header entry. However, since it's a restricted header, there is an extra step: you must turn on Postman Interceptor by clicking the Interceptor button at the top of the window (next to the sync button). If Interceptor is off, the Referer header entry will be ignored and you'll get an error: "Error 403: The request did not specify any referer. Please ensure that the client is sending referer or use the API Console to remove the referer restrictions."
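
Outside Postman, you can set the same header in code. A Python sketch (the referrer value is an example--it must match one of the key's allowed referrers, and the key is a placeholder):

```python
import urllib.request

req = urllib.request.Request(
    "https://www.googleapis.com/youtube/v3/search?q=test&part=snippet&type=video&key=YOUR_API_KEY",
    headers={"Referer": "http://localhost/"},  # pretend the request comes from a local site
)
print(req.get_header("Referer"))  # the header urllib will send with the request
```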

Now you're sending the GET request with an API key and you're getting back JSON results that look like this:

{
    "kind": "youtube#searchListResponse",
    "etag": "\"m2yskBQFythfE4irbTIeOgYYfBU/71Y1Pa_Vox_0ZzzjdbBNppwdf0s\"",
    "nextPageToken": "CAUQAA",
    "regionCode": "CA",
    "pageInfo": {
        "totalResults": 1000000,
        "resultsPerPage": 5
    },
    "items": [
        {
            "kind": "youtube#searchResult",
            "etag": "\"m2yskBQFythfE4irbTIeOgYYfBU/WePlVVP0Z4fWK6zl92pA9jVLbdQ\"",
            "id": {
                "kind": "youtube#video",

...

For more about the Google API options for YouTube Search, refer to the YouTube developer documentation.

Just that query alone is useful, but there's still one key thing missing from our request--authentication. I'm not going to go into detail about OAuth authentication here--you can read about that elsewhere--so this section will just follow the basic steps. To use the Google APIs as an authenticated user, you need an OAuth token.

There are a few ways to get a Google OAuth token. Probably the simplest option is to use Google's own OAuth playground. This is a super useful app that allows you to fiddle with all sorts of settings.

Another option is to use Postman's built-in Get New Access Token feature. To use it, click on Authorization (next to the Headers tab) and then the "Get New Access Token" button. Here you would enter the Auth URL as https://accounts.google.com/o/oauth2/auth and the Access Token URL as https://accounts.google.com/o/oauth2/token. You also need to know the Scope for your request--which you can look up in Google's OAuth Playground anyway.

Once you have an access token, a simple method to check its validity is to paste this URL into a browser address bar: https://www.googleapis.com/oauth2/v1/tokeninfo?access_token=

Note: Checking the token will show its expiration time and also its scopes.
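From the command line, the same token check looks like this (YOUR_ACCESS_TOKEN is a placeholder):

```shell
# Append the token to the tokeninfo endpoint; YOUR_ACCESS_TOKEN is a placeholder.
TOKEN="YOUR_ACCESS_TOKEN"
CHECK_URL="https://www.googleapis.com/oauth2/v1/tokeninfo?access_token=${TOKEN}"
echo "$CHECK_URL"
# curl -s "$CHECK_URL"   # a valid token returns JSON including expires_in and scope
```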

Tuesday, September 19, 2017

Google Cloud Platform for a Full-stack Angular Web Application

At first blush, getting a complete Angular web application deployed and running on the Google Cloud Platform (GCP) can be a daunting task. Each quickstart tutorial is reasonable enough, but if you read ahead through all of the various documentation pages you'll need, they can soon begin to snowball.



UPDATE: After I started writing this post, I discovered a Google lab that covers the same topic: Build a Node.js & Angular Web App Using Google Cloud Platform.

I'll just quickly provide an overview of deploying Angular as the front-end app (on Google App Engine) and MySQL with a Node.js API on the back end. MySQL will run on Cloud SQL and Node.js will run on App Engine.

The basic steps involved are:

1. Set up a GCP account and project

2. Download the Angular sample and install the requirements

3. Deploy your Angular front-end app to GCP

4. Set up a MySQL DB on Cloud SQL. I wrote a whole post about setting up MySQL on Google Cloud SQL.

5. Complete the tutorial for using Google Cloud with Node.js

Since these samples aren't linked together, you'll have to test the pieces separately until you develop some interaction in your Angular app. You can use Postman to test your Node.js API.
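curl works just as well as Postman for a quick smoke test; here's a sketch assuming a hypothetical /api/items route on a local dev server (port 8080 is the App Engine dev server default):

```shell
# Hypothetical endpoint; adjust the host, port, and route to match your API.
API_URL="http://localhost:8080/api/items"
echo "curl -s $API_URL"
# curl -s "$API_URL"   # uncomment once your Node.js API is running locally
```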

Monday, August 14, 2017

Set up MySQL on Google Cloud Platform

Here's a walkthrough of how to set up MySQL on Google Cloud Platform's (GCP) Cloud SQL service.



1. Create a Google account if you don't have one: https://accounts.google.com

2. Sign up for a free trial on: https://console.cloud.google.com

3. Create a GCP project to house your MySQL instance.

4. Create a Cloud SQL instance--obviously, choosing MySQL as the DB type.

5. Create a database in the instance by going to the Databases tab under the Instance. For this post, I'll create one called "test".



6. To add a schema to your DB, you can use the Cloud SQL command line, https://dbdesigner.net or MySQL Workbench. Of course, if you know SQL, you can just type out your schema, but if you want a visual representation of your DB, these tools are handy.

I like the simplicity and no-install aspect of dbdesigner.net, but I ran into a bug, so I don't use it as much as I would otherwise. Hopefully, they'll fix it soon. Until then, I recommend MySQL Workbench since it's a mature, full-featured tool.

7. If you just want to quickly test that your DB is working, you can run some simple queries in the Google Cloud Shell. To connect to your MySQL instance, open the shell (the little command line icon in the top-right of the GCP dashboard) and type in this command:

googleusername@projectId:~$ gcloud beta sql connect dbinstance --user=dbusername

(The username is usually root.)

You'll be prompted for your password and then you'll see the MySQL prompt. Note that it might take a few seconds before it appears.

Whitelisting your IP for incoming connection for 5 minutes...
Enter password:

Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 1234
Server version: 5.7.14-google-log (Google)
Copyright (c) 2000, 2016, Oracle and/or its affiliates. All rights reserved.
Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
mysql>

8. To run your test queries, you'll need to select your DB within your instance.

mysql> use test
Database changed

Once you've selected the DB, you can run a simple operation to create a table and then use SHOW TABLES; to see the results.

The command I'll use to create a simple table is:

CREATE TABLE `testtable` (
`id` INT NOT NULL AUTO_INCREMENT,
PRIMARY KEY (`id`)
);

This creates a new table with one column called "id".

And this is what it looks like when I paste this command into the Cloud Shell:

mysql> CREATE TABLE `testtable` (
    -> `id` INT NOT NULL AUTO_INCREMENT,
    -> PRIMARY KEY (`id`)
    -> );
Query OK, 0 rows affected (0.03 sec)
mysql> show tables;
+----------------+
| Tables_in_test |
+----------------+
| testtable      |
+----------------+
1 row in set (0.00 sec)
mysql>

9. Now that you have a working DB, you'll want to connect to it from code and possibly from a MySQL client (e.g., the 'dolphin' a.k.a. MySQL Workbench).

To get the info you need, go to the Instance Details page in GCP and look at the Properties under the Overview tab. What you need is the "IPv4 address". Grab that, but you won't be able to connect to your instance yet. First, you have to allow connections.

To allow your client to connect to your DB in the Google cloud, you need to open Authorized Networks and add your network IP. To get your client IP, type "what's my IP" into Google.

10. Now you can access your DB from a MySQL client, but what about a local dev instance of a GCP App Engine application? For that you need a Google Cloud SQL Proxy running.

NOTE: The default setting is for all apps within the same project to be authorized; if you're using the DB from a different GCP project, you'll have to authorize it explicitly.

Without the Cloud SQL Proxy running, your local dev instance of your app will return the error: Error: connect ECONNREFUSED xxx.0.0.xxx:3306.

You'll need to install the Google Cloud SDK (configuration will have you select the project; if you have multiple environments--e.g., test, dev, prod--you may have to change it at some point with the command gcloud config set project projectName) and enable the Cloud SQL API, if you haven't already. You'll also need a credential file to supply to the proxy. To get this file, create a service account in the IAM & Admin section of GCP. Make sure you grant the account rights to Cloud SQL. When you create the account, you'll be able to create a new private key and download a JSON credential file. The Cloud SQL Proxy will read this credential file.

https://cloud.google.com/sql/docs/mysql/connect-external-app

11. Once you have everything ready, here is the command to start the Cloud Proxy:
./cloud_sql_proxy -instances=instanceConnectionName=tcp:3306 -credential_file=credentials.json &

If all goes well, the Cloud Proxy command will return this output:

cawood$ 2017/04/23 22:34:39 Listening on 127.0.0.1:3306 for instanceConnectionName
2017/04/23 22:34:39 Ready for new connections


FYI: From the command line help:
Authorization:
  * On Google Compute Engine, the default service account is used.
    The Cloud SQL API must be enabled for the VM.

  * When gcloud is installed on the local machine, the "active account" is used
    for authentication. Run 'gcloud auth list' to see which accounts are
    installed on your local machine and 'gcloud config list account' to view
    the active account.

  * To configure the proxy using a service account, pass the -credential_file
    flag or set the GOOGLE_APPLICATION_CREDENTIALS environment variable. This
    will override gcloud or GCE credentials (if they exist).

Tuesday, July 11, 2017

Second Marker Software

My next project has begun. I’m working on a software startup called Second Marker.

Second Marker Software


Our first solution—and the inspiration for starting the company—is a free and easy alternative to video editing: Tunnel Video. If you want to make a highlight reel of YouTube videos, this will be the simplest solution.

There are endless possibilities for why someone would want to create and share a playlist of YouTube videos with start and end times.

Update: try the Beta for free at http://www.tunnelvideo.com!

Wednesday, June 21, 2017

Google Cloud SQL Proxy on Same Machine as MySQL

As a web developer, it makes sense that you'd want to connect to your Google Cloud SQL instances running in Google Cloud Platform (GCP) via the Cloud SQL Proxy and also have MySQL running locally on your machine. Unfortunately, the default configurations for these two systems will cause an error when you try to run the proxy: "bind: address already in use."

One clear reason to run the proxy is that it allows you to securely connect to your Cloud SQL instances with MySQL Workbench. You can get the IP (hostname) and port from GCP no problem, but to connect, you need to be running Google's proxy. (You also need to create a JSON credential file from GCP and add your IP to the Authorized Networks BTW.)

The reason for the error is that both the proxy and the local default install of MySQL will try to communicate on port 3306. When you try to start the proxy, this is the result:


doom:project cawood$ ./cloud_sql_proxy -instances=instancename-9999:us-west1:instance=tcp:3306 -credential_file=CredFileName-dflkjal.json &

doom:project cawood$ using credential file for authentication; email=test@test.999999.iam.gserviceaccount.com
2017/06/05 23:51:24 listen tcp 127.0.0.1:3306: bind: address already in use

To resolve this error, one solution is to create a config file for the local MySQL server and change the port.

One thing to note (as per the MySQL documentation) is that the installer for macOS will not create a config file: "Note As of MySQL 5.7.18, my-default.cnf is no longer included in or installed by distribution packages." If you don't have a config file already, you can create one with these commands:


$ cd /etc
$ sudo nano my.cnf

This is all you need in the file:

[client]
port = 3366

[mysqld]
port = 3366

Once you're done, use CTRL + O to save and CTRL + X to exit Nano.

It's worth noting that I used to see this error before I installed MySQL locally. The reason was that there was another Cloud SQL Proxy process ("cloud_sql_proxy") already running. The solution to that is simply to kill the process.
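To check for and stop a stray proxy process, pgrep and pkill (standard on macOS and Linux) do the job:

```shell
# List any running cloud_sql_proxy processes; pgrep exits non-zero if none match.
if pgrep -f cloud_sql_proxy; then
  echo "proxy running"
else
  echo "no proxy running"
fi
# pkill -f cloud_sql_proxy   # uncomment to kill the stray process
```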

Thursday, May 04, 2017

Wednesday, April 12, 2017

Node.js Authentication Error Connecting to Google Cloud SQL Proxy

I was receiving this error trying to run a local instance of a Node.js API against a Google Cloud Platform (GCP) database.

Error: ER_ACCESS_DENIED_ERROR: Access denied for user ''@'cloudsqlproxy~174.7.116.43' (using password: NO)
    at Handshake.Sequence._packetToError (/Users/cawood/GitHub/project/node_modules/mysql/lib/protocol/sequences/Sequence.js:52:14)
    at Handshake.ErrorPacket (/Users/cawood/GitHub/project/node_modules/mysql/lib/protocol/sequences/Handshake.js:103:18)
    at emitOne (events.js:96:13)
    at Socket.emit (events.js:191:7)
    at readableAddChunk (_stream_readable.js:178:18)
    at Socket.Readable.push (_stream_readable.js:136:10)

The solution was quite straightforward, but when I first googled the error, I couldn't find anything about the access denied error showing no user--normally the username appears in the error (i.e., 'username'@'cloudsqlproxy~...).

I thought the problem was with the GCP Cloud SQL Proxy, but I was starting it correctly and with a valid credential file for a service account:

$ ./cloud_sql_proxy -instances=name-111111:us-central1:instancename=tcp:0000 -credential_file=serviceAccountCreds.json &

The problem was actually with the environment variables for my local instance of the Node.js API: I hadn't exported them. Of course, the error was correct--I wasn't trying to connect as any user at all.

$ export MYSQL_USER="username" 
$ export MYSQL_PASSWORD="password" 
$ export MYSQL_DATABASE="test" 
$ npm start

To check if you have set these correctly, you can output them to the console when you start the server: console.log(config);
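The exports above can also be wrapped in a quick shell-side guard (a sketch assuming a POSIX shell) that fails fast if any variable is missing:

```shell
# Export the connection settings (placeholder values), then verify each one
# is non-empty before starting the server.
export MYSQL_USER="username"
export MYSQL_PASSWORD="password"
export MYSQL_DATABASE="test"
for v in MYSQL_USER MYSQL_PASSWORD MYSQL_DATABASE; do
  eval "val=\${$v}"
  if [ -z "$val" ]; then
    echo "missing $v" >&2
    exit 1
  fi
done
echo "env ok"
# npm start   # safe to start the server now
```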

cawood$ npm start 
> projectapi@0.0.1 start /Users/cawood/GitHub/project
> node server.js 
 { user: 'username', password: 'password', database: 'test' } 
App listening on port 8000
Press Ctrl+C to quit.