Saturday, March 14, 2015

Node.js goes pro: New opportunities -- and risks

In its mere five years of existence, Node.js has transformed from a technological curiosity to a technology stack all its own, providing a major building block for everything from microservices to APIs.
Much of that rise is due to the ecosystem of tools, development environments, and hosting services that has grown up around Node.js -- partly to make existing development tools (such as Visual Studio) Node.js-friendly, and partly to give Node.js the kind of professional-level support and service it requires.
But tooling specific to the needs of Node.js apps gives a more granular view of the health of the Node.js ecosystem, showing both how far Node.js has come and how far it still has to go. High-profile shifts away from Node.js, including the recent introduction of a fork, magnify not only the limitations of the Node.js ecosystem but also the direction in which the ecosystem must evolve.
Here is a look at the future of Node.js as seen in developments emerging today.

Development environments: Uneven -- and looking beyond the traditional IDE

Not surprisingly, JavaScript development environments have long revolved around authoring and debugging JavaScript as a client-side affair. Though the JavaScript consoles in Chrome and Firefox provide a REPL interface, among other debugging tools, they are mainly aimed at satisfying the needs of client, not server, developers.
For those looking to use JavaScript beyond the client, intriguing authoring and debugging tools are emerging. A project from StrongLoop, makers of a stack of Node.js dev tools, lets developers bring familiar front-end JavaScript debugging tools to bear on the back end. Node Inspector plugs Chrome’s Developer Tools into a running Node.js instance, enabling its inspection, breakpoint, step-through, source-file navigation, and REPL functionality to work directly with Node.js apps. It doesn’t yet perform profiling, though, and while another project is in the works to satisfy that need (Node.js Webkit Agent), it too is largely incomplete, profiling only heap and CPU usage for Node.js apps.
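At the time of writing, the typical workflow looks roughly like the following sketch -- the exact commands, port numbers, and the server.js entry point are illustrative and may vary by version:

    # Install Node Inspector globally via NPM
    npm install -g node-inspector

    # Start the app with V8's debug protocol enabled (default port 5858)
    node --debug server.js

    # In another terminal, start Node Inspector, then open the URL it
    # prints (typically http://127.0.0.1:8080/debug?port=5858) in Chrome
    node-inspector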
When it comes to IDEs, the picture is a little rosier. For one, most every professional IDE now has not only JavaScript support but Node.js support; Visual Studio and Eclipse lead the pack here. The Node.js Tools for Visual Studio, for example, clearly favor Microsoft’s ecosystem, but they now feature local and remote debugging, NPM support, and many of the other functions Node.js developers need.
Outside of the conventional IDE sandbox, interesting things are happening with Node.js. Consider IBM’s Node-RED, billed as “a visual tool for wiring the Internet of things.” With Node-RED, code is wired together in flows, or visual diagrams that show how “hardware devices, APIs, and online services” are connected. Obviously the programming isn’t done with flows alone; a copy of Eclipse is embedded within Node-RED for writing JavaScript. It isn’t likely that all Node.js apps could be created or edited this way, but it’s one example of how Node.js is inspiring new approaches to development tooling that look beyond the conventional IDE.

Hosting: Competition spurs support and innovation in the cloud

Before Node.js began making serious headway, the only way to run Node.js was to spin it up on bare metal that you owned. That time has long since passed, and cloud providers are now climbing over each other to provide Node.js hosting -- not only support for Node.js in VMs, but full-blown PaaS hosting for Node.js.
Most every brand-name PaaS now has Node.js support, and those that got an early start have gone to some lengths to bolster their support. Heroku, for instance, not only lets you deploy Node.js apps, but lets you take your pick of which version of Node.js or NPM to use (including newer, not necessarily supported versions). Amazon, on the other hand, has built on PaaS-like Node.js support to create an entirely new kind of service that runs functions on demand -- AWS Lambda. Because Node.js and JavaScript are event-driven, Amazon reasoned, why not create a mini-stack for Node.js that runs code in response to events piped into it from the rest of AWS?
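In practice, a Lambda function is little more than an exported Node.js handler that receives an event object. A minimal sketch -- the event shape and response here are hypothetical, and the context-based API reflects the original Node.js runtime:

    // A minimal Lambda-style handler: AWS invokes the exported function
    // with the triggering event and a context object.
    exports.handler = function (event, context) {
        console.log('Received event:', JSON.stringify(event));

        // Signal success (or failure) back to Lambda via the context object.
        context.succeed({ processed: true });
    };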
Amazon already had experience hitching Node.js to its services by way of an SDK that allows calls to AWS through Node.js, but AWS Lambda took an even bigger leap into Node.js territory. It’s unlikely this highly focused use of JavaScript will produce the demand enjoyed by the Node.js stack, but Lambda is intriguing. Amazon is clearly interested in what other fruit this approach can bear, as it has tentative plans to add support for other languages to Lambda down the road.
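That SDK follows Node.js’s usual error-first callback style. A minimal sketch, assuming AWS credentials are already configured in the environment:

    // List S3 buckets with the AWS SDK for Node.js.
    var AWS = require('aws-sdk');
    var s3 = new AWS.S3({ region: 'us-east-1' });

    s3.listBuckets(function (err, data) {
        if (err) {
            return console.error('Could not list buckets:', err);
        }
        console.log('Buckets:', data.Buckets.map(function (b) { return b.Name; }));
    });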
Another key change in the way Node.js works with hosts has come with the advent of Docker, the red-hot app containerization technology. Docker provides an easy way to bundle the Node.js runtime with an application’s code, data, and any other associated applications, meaning the dependencies the application requires -- including the specific version of Node it needs -- don’t have to be supported by the host. Docker also provides convenient ways to create and scale Node.js apps (such as via the open source Deis PaaS). And an NPM package for Docker, dnt, allows Docker to be used to test code against multiple versions of Node.js in parallel.
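Pinning the runtime to the app can be as simple as a short Dockerfile; here is a minimal sketch, in which the image tag and the server.js entry point are illustrative:

    # Bundle a specific Node.js version with the app, independent of the host.
    FROM node:0.12

    # Copy the app into the image and install its dependencies there.
    WORKDIR /app
    COPY package.json /app/
    RUN npm install
    COPY . /app

    # Run the app.
    CMD ["node", "server.js"]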
 
Given these developments, Node hosting options will likely proliferate going forward. Here, containerization is key, as it makes it possible for developers to run Node.js on host services without the host even supporting the application’s Node runtime of choice. But as seen in Amazon’s Lambda, support for Node.js can also reap rewards for hosting providers looking to leverage the Node API to build new services and products.

Testing and debugging: The Node.js Achilles’ heel

Here, one of the biggest limitations is the way the debugging API, present in the V8 engine at the core of Node.js, has consistently lagged behind V8 itself. Ugur Kadakal of Nubisa, makers of the JXcore variant of Node.js, noted that while the Promises feature of JavaScript has been supported in V8 for some time, the V8 debugger still fails when promises are used.
Joyent, by way of the Joyent Private Cloud and Joyent Compute Service, has developed its own solutions to the issues with Node.js debugging. The most painful of those issues: getting insight into memory consumption.
“Historically,” says Bryan Cantrill, chief technology officer at Joyent, “we have little insight into how memory is used in dynamic environments.”
To that end, Joyent included inside-out Node.js debugging support in its platform by leveraging the DTrace functionality of Joyent’s Solaris-derived SmartOS, on which the platform is built. The bad news is that anyone who wants to use the same toolset needs to run SmartOS, either on their own or via Joyent’s cloud. Cantrill didn’t rule out the possibility the debugging technology could be ported to other platforms, but admitted it has a lot of dependencies on SmartOS that would need to be resolved.
Testing frameworks represent another area for possible improvement. Bowery, creator of a cloud development-environment system, built the first version of its service in Node.js but eventually switched to Go for a variety of reasons. Among them was the fact that some testing frameworks for Node.js “worked better for front end, like Jasmine, and others were better for the backend, like Mocha.” With Go, they reasoned, testing is built in and standardized across the board; Node.js could benefit from having a testing framework of similar robustness.
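For reference, a back-end test in Mocha looks something like the following minimal sketch, which uses Node’s built-in assert module; the mylib module under test is hypothetical:

    // test/mylib.test.js -- run with the mocha command-line tool.
    var assert = require('assert');
    var mylib = require('../mylib'); // hypothetical module under test

    describe('mylib.add()', function () {
        it('adds two numbers', function () {
            assert.equal(mylib.add(2, 3), 5);
        });

        it('passes errors through the usual error-first callback', function (done) {
            mylib.fetchUser('missing-id', function (err, user) {
                assert.ok(err);  // expect an error for an unknown id
                assert.ok(!user);
                done();          // tell Mocha the async test has finished
            });
        });
    });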
One fairly high-profile exit from the world of Node.js development was spurred by the existing limitations on debugging and developing Node.js applications. TJ Holowaychuk, creator of Koa, Express, and the Node.js-canvas project, penned an essay in June 2014 wherein he bid farewell, albeit fondly, to Node.js development and its environment that “favours performance over usability and robustness.” Debugging and error handling, especially for callbacks -- one of Node.js’s core behaviors -- struck him as grossly underdeveloped.
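The pain is easy to reproduce. In Node.js’s error-first callback convention, every failure has to be checked and forwarded by hand, and an exception thrown inside an asynchronous callback surfaces with a stack trace that no longer shows where the operation began. A minimal sketch, with the config file and function names purely illustrative:

    var fs = require('fs');

    function loadConfig(path, callback) {
        fs.readFile(path, 'utf8', function (err, contents) {
            // Error-first convention: skipping this check (or the return)
            // silently drops the failure on the floor.
            if (err) {
                return callback(err);
            }

            // If JSON.parse throws here, the exception escapes asynchronously,
            // and the resulting stack trace won't include the caller that
            // originally asked for the config.
            callback(null, JSON.parse(contents));
        });
    }

    loadConfig('./config.json', function (err, config) {
        if (err) {
            return console.error('Failed to load config:', err);
        }
        console.log('Loaded config:', config);
    });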
To Holowaychuk, Node.js is a worthy and powerful project, but “performance means nothing if your application is frail, difficult to debug, refactor and develop.” Holowaychuk expressed confidence that these problems could be overcome in time.
In theory, an open source project like Node.js should develop by leaps and bounds, with problems like the ones outlined here attacked in short order. In practice, though, parts of Node.js haven’t evolved as quickly as others. Aside from debugging and inspection, concerns about the pace of Node development have arisen, as evidenced in release cycle issues over the past year.
Joyent's plans to address this became clear only recently. Version 0.12 of Node was finally delivered in February 2015 with improvements in the above vein, and a separate Node.js Foundation is being set up to move governance to a disinterested third party.
Before that, however, others took their own steps. A fork of the Node.js project, named io.js, came into being as a way to address debugging, slow release cycles, and governance, among other issues.
Other plans for the fork include adding tracing functions to Node.js on Linux, and using a more recent build of the V8 JavaScript engine to “integrate with the latest debugging tools Google has made for Chrome and take advantage of other work they’ve done to improve debugging.”
“Debugging has always been challenging in an async environment,” said Rogers, “but we’re making headway.”
The io.js project is still young, although a few existing Node.js deployments have begun using it (one engineer at Uber has apparently already done so at scale). But whether those advances and more like them come from io.js or Node.js itself, it's clear that changes are badly needed for Node.js to flourish even more.
[This article was edited to clarify the use of io.js by other companies.]
This story, "Node.js goes pro: New opportunities -- and risks" was originally published by InfoWorld.
