The value of a breakpoint

Roughly a year after enrolling, I remain more or less engaged with the possibilities opened up by a casual online course. What evolved from fleeting curiosity into a potential portfolio item has become one of the most time-consuming projects of my life. The reasons are manifold, but the most prominent, I think, is compatibility with my condition. Where prolonging a single process can trigger difficulties due to obstructed pathways, I am able to maintain productivity by switching tasks (creating, augmenting, or fixing either artwork or logic). While I don’t yet feel whole in that respect, positive strides strengthen my hope that I someday will.

Recently, I have become overzealous and not as careful as I was early on about maintaining balance. The slide typically begins with a racing train of thought: believing that I have struck upon a solution, I focus my energy on completing a single task. My degree of focus correlates inversely with inhibition, and the absence of inhibition, on a suboptimal path, breeds compulsion. Fixated on the daily grind, I more easily forget essentials like why something is being done, or even tending to my personal health, which recursively impacts decision making. An overlooked opportunity for improvement results in more work, if I am fortunate enough to realize my oversight later. The cycle continues until I can no longer ignore it, at which point I may feel exasperated surveying the impact.

As an example, I have shut out the idea of further relevant coursework and feedback, preferring the peril of reinventing a lesser wheel to the risk of losing my enthusiasm for the subject. Ignorance of predefined constructs encourages me to keep drilling inward for inspiration where I might otherwise be tempted to lean on the crutch of conventional wisdom. This is helpful to the extent that, so long as I strike something, even if it isn’t gold, I gain a clue about where to look inside for a solution next time. Sometimes, however, I settle for something much less than gold and push ahead without looking back, for the sake of doing more, and not necessarily better, work. Putting in hours makes me feel good about myself, but coming to grips with the attendant complications compelled me to take a step back and consider wrapping up the project, effectively calling it quits.

I found that the act of letting go told the compulsion it could no longer dictate my workflow. After taking some time away to recenter, I was able to reopen my project and look at it from a different perspective. No longer hampered by tunnel vision, I could view my work from a higher level and reconsider several constructs that were blocking progress. After a few days of refactoring, sensing diminishing value in my output and waning energy, I stepped back and repeated the process, with similar results.

I believe a key to returning to my previous level of productivity without sacrificing quality is knowing precisely when I begin conceding to compulsion and losing perspective. Each step forward yields a slew of considerations that mandate restraint until each can be given sufficient thought. A more practical solution may be to set intermittent, timed periods for surveying completed work and newly generated considerations.

Stumbling into art

To unearth a proclivity for programming, I did not need to stray far from the beaten path, but the small amount that I did bolstered my confidence to repeat the process, with mixed results.

Before I began studying, while drafting a specification encapsulating my [questionable?] ideas for a health informatics solution, I created a website as a sandbox intended for prototyping some of those ideas. Instead, I used it to roll around in aimlessly while making a mess. That mess, once I had tinkered enough with the frontend and wanted to customize it, came to include some of my first logo designs, which initially served as a wellspring of self-deprecating humor but eventually improved to the point that I no longer felt ashamed of them.

After finally learning to program and wanting to add a project to my portfolio, I took an intro course on game development and, being riveted, immediately began building a video game. This was similar to the earlier experience in that I should have studied the subject further before beginning production, but dissimilar in that this time I did not prototype my ideas in a sandbox first, which would have helped me better grasp the project requirements and construct a code base satisfactory enough to prevent constant refactoring.

Blunders aside, my earlier experience translated to the latter project when it came to creating sprites, particularly those that represented icons. Discounting my illustrations, which still need a lot of work, I felt satisfied enough with them to try my hand at making game music. My experience with music had comprised, since my youth, listening and humming along to a variety of genres, as well as classes I treated as respites from “serious” coursework. It was limited in that, although I had always had a guitar lying around, I rarely attempted to play it until recently, owing in large part to a stiff fretting hand.

Prior to beginning game development, I had generated one or two of my own riffs, but afterward I began experimenting a lot more, to the point where I have now created over forty. During that process I learned to make chiptunes using a synthesizer app, which availed me chords that I could not previously play due to the limitations of the physical medium. It was not until I had made significant progress with music that I decided to try my hand at full-scale level design, which, like my other artistic exploits, I lacked concentrated training and study in but had enough experience with to have developed a sense of taste.

I began with a loose goal of recapturing the sense of exploration I felt playing 8-bit games as a kid, and an initial approach of trial and error. Each time a design improved, I augmented my approach and experimented to see if I could arrive at a better one, then repeated the pattern. I consider levels the least abstract of the forms I have worked in, followed by illustrations, logos, sound effects, and music, in that order, so perhaps I should not be surprised that my dissatisfaction with my creations follows the same order; it is with the less abstract forms that research into predefined constructs might have more significantly impacted quality.

Whether in visual, auditory, or spatial art, I feel the practice of one translates to the others because the same mental process is engaged: experimenting with the arrangement of sensory inputs in order to satisfy the consciousness on a visceral level. To that extent, it becomes necessary for the artist to be attuned to their own consciousness, to discern what does and does not feel right. For me, this is very much a work in progress.

Communicating in code

At the outset of my programming study, I was taken by the ability to translate my thoughts into reality by talking to a machine in the right syntax. Every time it was instructed to run my program and did just that, beyond the elation of pleasant surprise, I felt as if I were forging a relationship with a new friend. So long as my intentions were communicated clearly (and my computer remained bug-free), it would always understand.

Because its success relies on a thought process that is logical and transparent, communicating in code provides a social safe harbor to those for whom human sociopathy does not compute. Code always says what it means and does not try to obfuscate its intentions. Words cannot take on double meanings because the user defines them. Interaction, once phraseology is overcome, flows in the pattern of mathematical properties down to the binary level, where logic begins to merge with the physical world. If I fail to dictate properly, I am only challenged to reconfigure my internal speech-processing filter to generate results with greater relevance and specificity. The outcome is completely within the user’s control and not based on arbitrary constructs, making the feedback useful not only for coding but for other aspects of life as well.

Its cheerful acceptance of, and constructive responses to, my input, valid or otherwise, and its lack of irrational judgment have in turn enabled me to explore and extrapolate ideas from the recesses of my consciousness with confidence and without fear of reprisal. The more I delve into this process, the more intuitive it feels. Whereas earlier on I was more concerned with structuring my syntax to the liking of the compiler, developing a game using a full-featured engine has empowered me to turn the bulk of my attention to finding a sweet spot and bringing it to life. That, and debugging.

Exploring possibilities

During my last college course, I used Hibernate to map Java wrapper classes, over a Java Database Connectivity (JDBC) connection, to a MySQL relational database. The application fetched data from the web through a connection servlet, interfaced through JavaServer Pages (JSP), and executed on a Tomcat web server, with unit tests based on mock database and server implementations. It was functional and incorporated features that could be expanded on to provide some differentiable value, but implementing a modern GUI would at that point have entailed using JavaScript, which from my brief encounters was not very pleasant to work with.
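To give a flavor of that mock-based testing approach, here is a minimal sketch assuming JUnit 4, in which the data-access layer hides behind an interface so a unit test can swap in an in-memory fake rather than the real Hibernate/MySQL connection. The names (PatientDao, PatientService) are invented for illustration, not taken from the actual project.

```java
import static org.junit.Assert.assertEquals;

import java.util.HashMap;
import java.util.Map;
import org.junit.Test;

public class PatientServiceTest {

    /** Abstraction over the persistence layer (Hibernate/JDBC against MySQL in the real app). */
    interface PatientDao {
        String findNameById(long id);
    }

    /** Business logic under test; it knows only the interface, never the database. */
    static class PatientService {
        private final PatientDao dao;
        PatientService(PatientDao dao) { this.dao = dao; }
        String greeting(long id) { return "Hello, " + dao.findNameById(id); }
    }

    /** In-memory mock standing in for the MySQL-backed implementation. */
    static class InMemoryPatientDao implements PatientDao {
        private final Map<Long, String> rows = new HashMap<>();
        InMemoryPatientDao with(long id, String name) { rows.put(id, name); return this; }
        @Override public String findNameById(long id) { return rows.get(id); }
    }

    @Test
    public void greetingUsesNameFromDao() {
        PatientService service = new PatientService(new InMemoryPatientDao().with(1L, "Ada"));
        assertEquals("Hello, Ada", service.greeting(1L));
    }
}
```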

Seeking out alternatives, I enrolled in two separate development courses on Udacity, one on Android and one on LibGDX. I completed the latter, along with its successor, before the former, then looked to expand the course project into a portfolio sample that would demonstrate my abilities to potential employers. Initially, I thought I would add and update a few assets to go along with a menu and some levels, then polish the code base and documentation. Absorbed in the creative process, I left polish in the dust, and the project scope expanded to accommodate a growing list of ideas I thought could make the game better.

Nine months later, with intermittent downtime due to some data science coursework as well as an unrelated side project, I can say that I enjoy game development and would like to explore the subject further. At the same time, I am shielding myself from any advice, for two reasons. One is to avoid forgoing the lessons that come from challenging myself to generate better answers on my own. The other is that, with my limited experience, I think it is helpful to delineate my own thinking while working through problems; I can always take in outside input, and outline it along with my responses, toward the end of the project.

I realize that I am straying from best practices and common-sense measures that, especially in the context of high-performance game development, are critical to meeting goals in time-sensitive and/or collaborative work environments, and are best sought early. On the other hand, I have been free to consider different approaches to solving a problem more liberally, including constructs that could be applied broadly in similar cases to achieve a desired end result. Though I am probably just reinventing a lesser wheel, I believe that doing this now will make feedback more resonant and appreciable in light of the additional insight.

The path of a programmer

My first programming course relied on the text Programming in C by Stephen Kochan. Lessons covered basic library features and data structures early on, then memory management and linked list implementations, while throughout explaining the corresponding lower-level interactions, like those between the compiled code and the machine’s RAM and CPU. I appreciated the simplicity of procedural coding but, given its constraints, did not believe I could create a modern interface application from scratch within a reasonable timeframe. After the course ended, I continued to add features to my final project and soberly considered rewriting it to incorporate object-oriented features. Reading about the advantages of other high-level languages, which were designed with those features in mind, thankfully convinced me to scrap that idea and enroll in a C++ course.

In C, I had wanted to create a structure composed of variables and functions, by which a variable of the structure’s type could be defined and assigned a reference to a memory location storing an instance of that structure, sharing its composition but carrying different state from other instances. C++ cleanly supports this, with the structures referred to as classes and the instances as objects, and goes several steps further. A class can inherit from, override aspects of, and add definitions to another class’s composition with a simple declaration, by which objects of the class are also identified as instances of the inherited type, enabling collections to store, conditions to check for, and loops to iterate through objects of different classes. Being implemented atop C and maintaining all of its functionality, the language let me apply previous lessons in the context of a modern language, which would prove helpful in my next course, on Java.
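A rough sketch of that inheritance-and-polymorphism idea, written here in Java (the language I moved to next) with made-up class names, might look like the following; the same shape applies in C++ with virtual functions and base-class pointers.

```java
import java.util.List;

public class InheritanceDemo {

    static class Enemy {
        String describe() { return "a generic enemy"; }
    }

    // Inherits everything from Enemy and overrides one behavior.
    static class Slime extends Enemy {
        @Override String describe() { return "a slime that splits when hit"; }
    }

    static class Bat extends Enemy {
        @Override String describe() { return "a bat that swoops in arcs"; }
    }

    public static void main(String[] args) {
        // Objects of different classes, stored and iterated through the inherited type.
        List<Enemy> enemies = List.of(new Enemy(), new Slime(), new Bat());
        for (Enemy e : enemies) {
            System.out.println(e.describe()); // each object answers with its own override
        }
        // Conditions can also check for a specific subtype.
        System.out.println(enemies.get(1) instanceof Slime); // true
    }
}
```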

For example, I was better able to understand the interaction between machine code and the Java Virtual Machine, which addresses the portability concern of compiled languages. While I fell in love with the flexibility, power, and safety of C++, I was not thrilled with Java: object parameters are shared references, so client code can reach in and corrupt an object’s internals; forced automatic garbage collection discards the myriad benefits of custom destructor implementations; and as for the lack of true multiple inheritance, well, I think that’s probably saying enough. Fortunately, Joshua Bloch’s Effective Java, though long in the tooth, was to me still very effective at bridging the gap between code design in C++ and Java, and it convinced me to stick with a language whose unmatched support makes it a valuable asset in any developer’s toolbox. After implementing a database application with a web interface and network connectivity, I felt empowered by how much could be accomplished with one language alone.
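As a concrete illustration of that reference-sharing complaint, and of the defensive-copy remedy Bloch advocates, here is a small sketch; the Roster class is invented for the example, not drawn from any of my projects.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class ReferenceLeakDemo {

    static class Roster {
        private final List<String> players = new ArrayList<>();

        void add(String name) { players.add(name); }

        // Hands out the internal list itself; callers can corrupt it.
        List<String> getPlayersUnsafe() { return players; }

        // Defensive copy in the spirit of Effective Java: internals stay intact.
        List<String> getPlayersSafe() {
            return Collections.unmodifiableList(new ArrayList<>(players));
        }
    }

    public static void main(String[] args) {
        Roster roster = new Roster();
        roster.add("Ada");

        // The shared reference lets the caller silently empty the roster's own state.
        roster.getPlayersUnsafe().clear();
        System.out.println(roster.getPlayersSafe()); // prints [], internals were corrupted

        // The defensive copy fails fast instead of exposing internals.
        roster.add("Grace");
        try {
            roster.getPlayersSafe().add("Mallory");
        } catch (UnsupportedOperationException e) {
            System.out.println("internals intact: " + roster.getPlayersSafe()); // [Grace]
        }
    }
}
```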

Studying and gaining familiarity with C, C++, and Java provided me with a learning roadmap that I felt could be applied to any language whenever the next opportunity arose. Rather than concentrating on a specific language, I shifted my focus to gaining insight into, and experience with, other aspects of the development process.

Overcoming inertia

I might have learned programming years earlier had I not been resistant to the idea of giving up on my past.

I had volunteered and worked long hours as a laborer to build character and save money for school, where I forfeited a social life to overload on coursework while playing sports as an unrecruited, undersized walk-on. Soon after graduating and starting a new role, I was diagnosed with a condition that forced me to turn my attention toward my health. I asked not to be compensated for downtime and tried to be as unobtrusive and helpful as possible. Despite not being retained, I left on good terms and continued to look for work while studying for and passing exams in my previous field.

As I also became more aware of the struggles other people face when dealing with medical conditions, I came to view software as a way to help them maintain their standard and quality of living. Not being able to find a job in my previous field forced me to consider my interest not merely as a hobby but as a possible link to a career path compatible with both myself and prospective employers. Now feeling closer to that goal, and sensing a viable pathway to it, I can reflect on the ebbs and flows that have led me to this point and be that much more grateful to everyone and everything that has been a part of it.

I no longer feel like I am giving up on my past; by considering its lessons and following my heart, I am allowing it to guide me to where I should be.

Getting acquainted with code

Around the time I was brainstorming the software ideas that led me to draft a specification, being devoid of coding experience and unsure of where to start, I published a website with WordPress to begin thinking about the development process.  Realistically, I knew that the backend of my idea would require custom code, but by seeing how far I could progress using only resources within arm’s reach, I hoped to better appreciate the requirements of my particular goals, from which I could begin surveying options.  What started as primarily installing and configuring existing plugins and widgets led me to begin examining and editing code snippets through trial and error.

The site did not end up incorporating any of the core functionality of the app I had in mind. Even if independently developed and fully integrated APIs together achieved all of the functionality I sought, not understanding how they worked individually would mean significant downtime whenever an issue arose while I got myself up to speed.  I would be learning, but under the untenable condition of being forced to treat each issue as a crisis, devoid of the insight needed to gauge its severity.  I knew I did not want to risk staying in a perpetual state of improvising with an inferior skill set, and I needed to work up the courage to commit to a new path.

I had always been curious about programming, but at that point I felt a concrete need to delve into its study.  Desiring a lower-level foundation, I decided that learning about assembly and machine language, without actually learning them, was enough for me, and drew the starting line at the C language.  I picked up a few books on the subject (one by Perry and Miller, the other by Kochan), completed the first, and read a few chapters into the second before enrolling in courses.  For an aspiring programmer, I would recommend this route by default.  Once certain foundational knowledge is acquired, independent learning opportunities abound.

A lengthy learning experience

Before I started programming, I kept a running log of ideas that, after talking to a professor, led me to begin a provisional patent application.  Researching and writing the specification on my own took me over a year, at a rate of about a dozen hours per month, which usually ended up as one all-night frenzy.  I was not able to afford an attorney to help with substantive matters, so the advice I received was limited to searching prior art and properly formatting and filing a nonprovisional application.

In hindsight, I wish I had not jumped so eagerly at the first suggestion to begin what is, for most inventors, a years-long, expensive, and grueling process that rarely ends well.  Had I properly weighed the risk of failure against the actual benefit to be gained at the outset, I may never have filed that initial application.  On the other hand, it was as good a primer for an aspiring programmer as I can think of.  It propelled me past the inertia of my fears to pursue a subject I was passionate about down to my core.

Here is a link to that specification:  click to view