What is GlassFish?

I jumped down another rabbit hole trying to figure out how to get started with Java EE without using an IDE. Although IDEs are very handy when it comes to Java development, they can also be a crutch. For instance, if you want to transition to CI, do you actually know what commands the IDE runs when you right-click and run tests?

First, I have no idea what Java EE actually is. There is something called GlassFish, which is an open source Java EE “reference implementation”. It is also the same thing that gets installed when you go to the main Java EE website.

Java EE does not yet support the latest JDK (Java 9). On my Mac I had a tough time trying to get two versions of Java to coexist.

I think 99.9% of all tutorials about getting started with Java EE involve using NetBeans or Eclipse. I wanted to write one that uses only the CLI, which means using Maven.

Maven has a concept called “archetypes”: templates that generate the directory structure for a new Java project. The main problem is that I could not find a bare-bones Java EE archetype.
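
For reference, generating a new project from the standard quickstart archetype looks something like this (the group and artifact IDs below are just placeholders):

mvn archetype:generate \
  -DgroupId=com.example.app \
  -DartifactId=my-app \
  -DarchetypeArtifactId=maven-archetype-quickstart \
  -DinteractiveMode=false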

At the end of the day, I dug deep into the rabbit hole and came up empty. I will figure this out at some point and write a blog post about it.

Learn Kubernetes with Interactive Tutorials

I wanted to get a deeper understanding of how Kubernetes actually works, so I started to work through the tutorials on the Kubernetes documentation website. Kubernetes is a container orchestration system that provides standard tooling for deploying, scaling, and managing containers at scale.

The tutorials themselves are amazing.

The tutorials use Katacoda to run a virtual terminal in your web browser that runs Minikube, a small-scale local deployment of Kubernetes that can run anywhere.

At a high level, Kubernetes allows you to deploy a cluster of resources as a single unit without having to think much about the underlying individual hosts. It follows a master/node model: a centralized control point manages your cluster, while worker nodes perform the actions that your application needs.

Kubernetes supports running both Docker containers and rkt containers. I’m pretty familiar with Docker. I learned more than I ever wanted to over the last few years of working at CircleCI. I have never used rkt, but am looking forward to learning more in the future.

It is really neat that you can simulate a production-like instance on your local computer using Minikube. This is a great way to learn Kubernetes as well as to do local development.

The Kubernetes docs have interactive tutorials that let you get your hands dirty with Kubernetes without having to install anything. These tutorials are powered by Katacoda, a tool I had not come across before; it is a neat web service that lets you learn new technologies right in your browser.

Kubernetes in your Browser

The first tutorial teaches you how to use Minikube and the kubectl CLI to create a new cluster.
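
If you want to follow along on your own machine instead of in the browser, the first steps boil down to a handful of commands (assuming Minikube and kubectl are already installed):

minikube start          # boot a single-node local cluster
kubectl cluster-info    # confirm the control plane is reachable
kubectl get nodes       # the single minikube node should report Ready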

One of the most amazing parts of Kubernetes to me is the self-healing aspect. For example, once you have defined what your application stack consists of, if a pod or node happens to go down, Kubernetes will automatically reschedule the affected workloads so that the desired state is restored.
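
You can see this for yourself with any deployment that is managed by a ReplicaSet: delete one of its pods and a replacement shows up almost immediately (the pod name below is just a placeholder):

kubectl get pods                        # note the name of a running pod
kubectl delete pod hello-node-abc123    # remove it on purpose
kubectl get pods                        # a new pod is already being created in its place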

Not only does the interactive online tutorial let you use a real Kubernetes cluster from within your browser, you can even preview the cluster's web UI and watch your application running.

Kubernetes Web UI

This is such a great way to learn.

Apex Triggers

I worked on the Apex Triggers module on Trailhead. Apex triggers are very similar to database triggers (remember those?). At my first job, at an enterprise healthcare company, our database was littered with hundreds of triggers that performed various actions whenever records were inserted, updated, or removed.

Triggers are a powerful concept, but they tend to be very difficult to maintain at scale, especially when you have a large team. I think they are an artifact of legacy development methodologies. These days most of the actions that triggers used to be responsible for are handled either as part of the model or as separate background tasks.

Despite this being true in most modern software development, Salesforce gives you a first-class way to write triggers that do things when records change. I think this is a case where they are still “ok” to use, because they remove a lot of the overhead of figuring out how to keep track of the state of all of your various records.

The best part about Apex triggers is that, unlike DB triggers, which require you to write your code in an enhanced variant of SQL, Apex triggers let you write the code in Apex. This means that you can take full advantage of all of the built-in Salesforce libraries, as well as make HTTP callouts (the most powerful part of all of this) in a really simple way.

One thing to note is that if you do make HTTP callouts from a trigger, you must do so asynchronously.

Apex triggers have handy access to the context that fired them, including both the old and new state of the affected records (Trigger.old and Trigger.new).

One great hint that the module gives is to write your code to support both single and bulk operations. Most triggers that I have written operate on only a single object at a time, but there may come a day when I want to do work on multiple objects at once, for example if I were using the Bulk API. By writing the code in a way that supports bulk operations (essentially using a for loop, as in the sketch below) you can reuse the same code in the future rather than having to handle both cases separately.
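
As a rough sketch (this is not from the module; I am just using the standard Account object for illustration), the bulk-friendly pattern looks something like this:

trigger SetDefaultDescription on Account (before insert) {
    // Trigger.new contains every record in the batch, whether that is 1 or 200
    for (Account acct : Trigger.new) {
        if (acct.Description == null) {
            acct.Description = 'Created via trigger';
        }
    }
}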

Salesforce DX External Sharing Model

I was working through the Getting Started with Salesforce DX module on Trailhead, and when it came time to push the Dreamhouse app up to my scratch org I got a dozen or so error messages complaining about all sorts of things.

PROJECT PATH                                                                                    ERROR
──────────────────────────────────────────────────────────────────────────────────────────────  ──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
force-app/main/default/objects/Property__c/Property__c.object-meta.xml                          Can't specify an external sharing model for Property__c
force-app/main/default/objects/Favorite__c/fields/Property__c.field-meta.xml                    referenceTo value of 'Property__c' does not resolve to a valid sObject type (65:13)
force-app/main/default/objects/Favorite__c/listViews/All.listView-meta.xml                      In field: columns - no CustomField named Favorite__c.Property__c found (88:16)
force-app/main/default/layouts/Broker__c-Broker Layout.layout-meta.xml                          In field: relatedList - no CustomField named Property__c.Broker__c found (81:19)
force-app/main/default/layouts/Favorite__c-Favorite Layout.layout-meta.xml                      In field: field - no CustomField named Favorite__c.Property__c found (13:26)

Luckily, the error messages are pretty useful. In this case it looks like the “External Sharing Model” was not turned on in my scratch org; it appears to be off by default.

In order to get this step to work:

    1. Log into your scratch org with sfdx force:org:open
    2. Go to Setup
    3. In the Quick Find box, search for Sharing Settings
    4. Click Enable External Sharing Model

Sharing Settings in Salesforce

Now you can run the push command and deploy the Dreamhouse app without any issues.
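
For reference, the push in question is the standard source push for a scratch org:

sfdx force:source:push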

Keep on trailblazing!

Oskar’s Heavy Boots

In Extremely Loud and Incredibly Close, Jonathan Safran Foer tells the story of a young boy named Oskar who is on a quest to come to terms with the sudden death of his father in the 9/11 attacks. While rummaging through his father’s belongings a few days after the tragic events of that day, he finds a mysterious key inside a vase. Determined to find the lock that it belongs to, he travels all around New York City in search of closure. Foer captures the voice of a nine-year-old boy perfectly. We are immediately attached to him and his terrible loss and spend the rest of the book hoping that he succeeds in his journey.


EXTREMELY LOUD AND INCREDIBLY CLOSE
By Jonathan Safran Foer
368 pp. Mariner Books $25

September 11th was not the only tragedy covered in this book. A generation earlier, Oskar’s grandfather survived the bombing of Dresden. While he escaped with his life, he chose to live as a victim rather than a survivor. He leaves Oskar’s grandmother abruptly, loses the ability to speak, and spends many years writing letters to his son (Oskar’s father) that he never delivers before his death.

The book consists of intertwined segments. The main story is pushed along by Oskar’s narration, while pieces of the past are presented in the form of letters from his grandparents. It explores a wide range of emotions, including tragedy, loss, love, and regret.

I regret that it takes a life to learn how to live, Oskar. Because if I were able to live my life again, I would do things differently.

Oskar slowly finds a way to cope with his father’s death. Throughout his journey he comes up with many provocative metaphors. The one that stood out the most to me compares life to a building on fire.

Everything that’s born has to die, which means our lives are like skyscrapers. The smoke rises at different speeds, but they’re all on fire, and we’re all trapped.

It’s difficult to read this book even a decade after the terrible events of that day. Those of us who were witnesses were changed forever in one way or another. An entire generation has now grown up viewing life through the lens of everything that happened before 9/11 and everything that has happened after.

We are quickly approaching the date when everyone under the age of 18 will have been born after September 11, 2001. I imagine they will grow up to view this day the way people in their 30s and 40s think about Pearl Harbor or the bombing of Hiroshima: a terrible event that happened long ago but has little emotional connection to everyday reality. Historical fiction is important in this regard. Unlike non-fiction books, which tell an objective story with facts, figures, and death tolls, fiction allows us to view the event from the perspective of a real human being. We feel something more than shock. We learn something more than a statistic or a timeline of events.

I thought if everyone could see what I saw, we would never have war anymore.

This book does not have a happy ending. We walk away feeling the same hopelessness and loss that Oskar does. Our boots become very heavy. The next 9/11, Hiroshima, Bombing of Dresden, Rape of Nanking, or < INSERT NAME OF TRAGEDY HERE >, is potentially days away. I would love to live in a world where books like this one were pure fiction, instead of based on a true story.

Obama’s Journey to Discover His Roots

In 1995, after becoming the first black president of the Harvard Law Review, a young and relatively unknown politician named Barack Obama wrote a candid memoir tracing his quest to discover who he was.


DREAMS FROM MY FATHER
By Barack Obama
466 pp. Random House $17

Obama begins by recounting his childhood growing up in Honolulu, where he became estranged from his father at a very young age. His father was from Kenya and his mother was a white woman from the Midwest.

It couldn’t have been easy growing up as a mixed-race person in the 1960s and 70s. Race relations in the United States were at a breaking point, and every bit of progress made through legislation never seemed quite enough to change the attitudes of the general population. His struggle with identity, belonging, and purpose continued throughout his childhood and into his later years.

He had a strong support structure thanks to his mother and grandparents. They accepted him, encouraged him, and ensured that he was given the tools that he needed to succeed. Unfortunately, their support was not quite enough to calm the gnawing feeling of not belonging.

Know where you belong, he advised. He made it sound simple, like calling directory assistance. “Information—what city, please?” “Uh … I’m not sure. I was hoping you could tell me. The name’s Obama. Where do I belong?”

Obama, Barack. Dreams from My Father: A Story of Race and Inheritance (pp. 114-115). Crown/Archetype. Kindle Edition.

Obama had limited engagement with his father growing up; they mostly communicated via letters. His father’s advice to him was to know where he belonged. With this advice in hand, upon completing his undergraduate studies Obama began exploring activism and political organizing.

The next part of the book chronicles his work as a community organizer in Chicago. We learn about the struggles of the community and the long hours and hard fought battles that took place in order to make any sort of progress.

The last part of the book goes into detail about Obama’s journey to Kenya to meet his father’s side of the family. It was common in Kenya for men to have multiple wives, which resulted in very large families. We are introduced to close and distant relatives through a series of vivid recollections of the conversations, stories, and experiences that took place.

Obama’s writing style and voice are superb. He tells an honest story and produces rich characters that we can relate to through the brief vignettes that we are shown. His descriptions of the people, places, and things that he encounters on his quest transport the reader from the beautiful islands of Hawaii, to the chilly slums of Chicago, all the way to the arid plains of Kenya. It is amazing to witness the level of detail that went into developing the compelling dialog and meaningful stories that are scattered throughout the memoir.

In the epilogue, Obama laments the challenges of studying and practicing law.

The study of law can be disappointing at times, a matter of applying narrow rules and arcane procedure to an uncooperative reality;

Obama, Barack. Dreams from My Father: A Story of Race and Inheritance (p. 437). Crown/Archetype. Kindle Edition.

He poses a question for us to think about.

How do we transform mere power into justice, mere sentiment into love?

Obama, Barack. Dreams from My Father: A Story of Race and Inheritance (p. 438). Crown/Archetype. Kindle Edition.

Many leaders don’t start writing books until they are well into the prime of their careers. This peek into the early part of Obama’s life, written before he became one of the most powerful people on Earth, provides a unique perspective that helps us understand his character and values. Obama’s story has unique twists, but its general theme is a universal one that inspires anyone struggling to find where they belong in this world.

Looking Back on 2017

2017 was a challenging year for our society. The political climate in the United States is hostile, uncertainty clouds the future, and in many ways it felt like we took several steps back as a nation. Luckily there are glimmers of hope, and I look forward to seeing what 2018 brings. I wanted to take a moment to reflect on all of the things that happened to me this year.

I continued my journey to the state capitals as a part of my Tralev project. By far the most memorable trip was visiting Honolulu with my family. I slowed down a bit toward the end of the year for various reasons but I look forward to continuing this project in the new year.

My writing took a turn for the better. I was re-reading “On Writing Well” during my trip to Boise and witnessed a local author speaking about his own writing. I was so moved by his speech that I made a laundry list of writing goals for myself. Although I did not accomplish all of them, I have continued to write consistently and have been lucky enough to join a writing club in San Francisco. I look forward to really taking my writing to the next level in 2018.

I started a new job at the end of the year at LaunchDarkly. Working at CircleCI was honestly the best job that I have ever had, and I am so grateful to everyone at that company who made my time there rewarding and special. LaunchDarkly is a small company with big plans for 2018. I can’t wait to be a part of those plans and watch the company grow over the next year.

I traveled to Uzbekistan for my brother’s wedding. It was an amazing experience full of wonderful people, delicious food, and a rich culture. I am so happy that my brother found love, and I wish nothing but the best for him and his wife. I hope that in 2018 we will see more of each other and maybe even welcome a new nephew or niece? 🙂

I continued my relationship with Aosheng. We have been on many adventures together, and we are starting off 2018 on an exciting note by traveling to China during the second week of January.

I began taking some classes at UC Berkeley Extension and have been really inspired by the community of professionals doing continuing education. I look forward to taking even more courses in 2018.

I started a handful of coding projects, gave up on more, rekindled others. Still searching for the next big idea, but having a great time along the way. I also set a goal to become a Python Core Developer. I didn’t reach it this year, but I hope to make some significant progress toward this goal in 2018.

I started going to more meetups toward the end of the year. It has been great to meet all sorts of new people doing exciting things. I look forward to continuing to be a part of the local tech community and perhaps even start giving talks of my own at various meetups around town.

All in all, 2017 was a great year. My main goal for 2018 is to successfully turn 30. In addition I want to write more, code more, listen more, read more, and travel just enough. 😉 I am wishing everyone a very Happy New Year. I hope that in 2018 all of your dreams come true.

A Painting Comes to Life

A painting captures a single moment in time. Unlike a photograph, we never know if the things in a painting actually existed in that moment. The objects, people, and scenery are all painted over a long time using fuzzy human memory. We are left with an impression of what might have been.

While wandering through art galleries I often wonder what was happening in those scenes. How were the people feeling? What were the sounds in the atmosphere? What was the weather like? Most of all, out of all the moments in an artist’s life, what made this one worth painting?

In Loving Vincent, by Dorota Kobiela and Hugh Welchman, we get a unique perspective and potential answers to some of these questions. The world’s first oil-painted animated movie takes us on a journey through the circumstances of Vincent van Gogh’s death. In learning about his tragic death, we get a glimpse into pieces of his life.

Like many artists, Vincent van Gogh was a tortured soul. During his short career as an artist he created hundreds of oil paintings. Although his contemporaries considered him crazy and a failure, he achieved international acclaim after his death and is considered one of the most influential artists of modern times. We see evidence of this today since it is nearly impossible to visit any modern art museum without seeing at least one of his works.

Kobiela and Welchman bring a selection of van Gogh’s most famous works to life. The postmaster, woman at the piano, man in a yellow jacket, the paint seller, and many other famous paintings are transformed from still portraits into full characters with emotions, dreams, goals, and lives of their own.

The film is a work of art in and of itself. Even if you don’t care about fine art, film, animation, or the life of van Gogh, it is difficult to watch this film without a sense of appreciation for the six years of work, and the hundreds of painters, that it took to produce it. It will be difficult to view his work again without imagining the motion.

If you were not able to see this truly unique film in a theatre, the Blu-ray and DVD versions of Loving Vincent will be available on January 16th.

Using Plex with Nextcloud

After hearing about it for years, I finally got around to installing Plex on my NUC. I’m impressed with everything about Plex. It was easy to install, and it mostly works out of the box. I am using it to manage my ever-growing movie collection and massive music library.

All of my files were already on the NUC since I am using Nextcloud. Rather than duplicating the files, I pointed my Plex media library at the same directory where my files live in my Nextcloud installation.

This poses a couple of permissions problems. On Ubuntu, this directory is owned by the www-data (Apache) user and group. In order for Plex to see the files at all, I had to add the plex user to the www-data group and then restart the Plex service. The following commands make that happen:

sudo usermod -aG www-data plex
sudo systemctl restart plexmediaserver.service

My biggest complaint with most “home media servers” is that once you point them at your files, you cannot really “manage” the files themselves. For instance, I have a massive (50+ GB) music collection that I have built up over the years. When I am listening on shuffle I want to prune out some of the songs that I hate. Luckily, with Plex this is very simple. The only catch is that the www-data group needs read/write/execute access to those files.

In order to make this happen you can run the following command against your data directory. Be sure to replace the path below with whatever you are using for your own Nextcloud files.

sudo chmod -R 775 /var/www/nextcloud/data/levlaz/files

Doing these two things makes the Plex + Nextcloud integration work very well. Now whenever I add or remove files from my many different computers everything stays in sync.

SQLite DB Migrations with PRAGMA user_version

This blog runs on a simple homegrown blogging engine that I wrote, backed by a SQLite database. I have a function in the Flask app that performs database migrations. My current approach has been to keep a folder full of migrations and run them sequentially whenever the app starts.

This works well for adding and removing tables, since SQLite has the handy IF NOT EXISTS option. However, when you are altering an existing table, this entire model falls apart, because ALTER TABLE has no IF NOT EXISTS equivalent.
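
For example (the table and column names here are made up), the first statement below can safely be run over and over, while the second one throws a "duplicate column name" error the second time it runs:

CREATE TABLE IF NOT EXISTS posts (id INTEGER PRIMARY KEY, title TEXT);
ALTER TABLE posts ADD COLUMN image TEXT;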

Practically, this means that outside of a fresh install my database migrations are useless.

I am still being stubborn and not using a well-written solution like Alembic (which I would highly recommend for a “serious” project) for this blog. Instead, I discovered that SQLite comes with a built-in mechanism for keeping track of the schema version: the PRAGMA statement, specifically user_version.

Using PRAGMA user_version for DB Migrations

My migrations folder structure looks like this:

.
├── blog.db
├── blog.py
├── __init__.py
├── migrations
│   ├── 0001_initial_schema.sql
│   ├── 0002_add_unique_index_to_posts_tags.sql
│   ├── 0003_add_fts.sql
│   ├── 0004_add_column_to_post.sql
│   ├── 0005_add_comments_table.sql
│   └── 0006_add_admin_flag_to_comments.sql

As you can see the naming convention is 000N_migration_description.sql. Each migration file has the following statement in it:

PRAGMA user_version=N; (where N is the numeric prefix of the file name)

This bumps the database’s user_version to match the version encoded in the file name.
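
To see what this pragma does on its own, here is a quick sanity check you can run from a Python shell (the database file name is just an example):

import sqlite3

db = sqlite3.connect('blog.db')
print(db.execute('PRAGMA user_version').fetchone()[0])   # prints 0 on a brand new database
db.execute('PRAGMA user_version = 3')                    # set it by hand
print(db.execute('PRAGMA user_version').fetchone()[0])   # now prints 3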

The code to do stuff with the database is shown below:

import os
import sqlite3

from flask import g

# Note: `app` is the Flask application object defined elsewhere in blog.py.


def connect_db():
    """Connects to Database."""
    rv = sqlite3.connect(
        app.config['DATABASE'],
        detect_types=sqlite3.PARSE_DECLTYPES | sqlite3.PARSE_COLNAMES)
    rv.row_factory = sqlite3.Row
    return rv


def get_db():
    """Opens new db connection if there is not an
    existing one for the current app ctx.
    """
    if not hasattr(g, 'sqlite_db'):
        g.sqlite_db = connect_db()
    return g.sqlite_db


def migrate_db():
    """Run database migrations."""

    def get_script_version(path):
        return int(path.split('_')[0].split('/')[1])

    db = get_db()
    current_version = db.cursor().execute('pragma user_version').fetchone()[0]

    directory = os.path.dirname(__file__)
    migrations_path = os.path.join(directory, 'migrations/')
    migration_files = list(os.listdir(migrations_path))
    for migration in sorted(migration_files):
        path = "migrations/{0}".format(migration)
        migration_version = get_script_version(path)

        if migration_version > current_version:
            print("applying migration {0}".format(migration_version))
            with app.open_resource(path, mode='r') as f:
                db.cursor().executescript(f.read())
                print("database now at version {0}".format(migration_version))
        else:
            print("migration {0} already applied".format(migration_version))

The relevant part for this blog post is the migrate_db() function. Three things are happening:

  1. The get_script_version() helper function extracts the integer from the migration name.
  2. current_version gets the current value of user_version of your database.
  3. We iterate over each migration file in the migrations folder and perform a simple check. If the migration version is larger than the current_version we run the migration, otherwise it gets skipped.

This covers most cases and allows for a smooth upgrade path if anyone ever decides to start using this blogging engine themselves. I am still pretty happy with this approach because it is essentially a fully functional migration system in just a handful of lines of Python.