Supercharge your development workflow! Spark is an extensible script runner, giving you access to database connections, a web API client and much more.

Spark: Ignite your development workflow

Spark is a utility to help with everything from various mundane tasks to complex database migrations and project deployment.

Installation

Download spark.phar and make it executable. If desired, alias spark=spark.phar. You may also want to alias sparksh='spark repl'.

Download the latest release from dev.noccylabs.info and extract it into a directory somewhere, such as /tmp:

$ mkdir /tmp/spark; cd /tmp/spark; unzip ~/Downloads/spark-0.1.0-dist.zip

Make sure spark.phar is executable, and run the installer. Then just follow the instructions:

$ chmod +x spark.phar
$ ./spark.phar install

Afterward you will be able to invoke spark directly, along with the optional sparkplug, sparker, sparkres and sparksh aliases. You can then install new plugins into ~/opt/spark/plugins and enable them in your projects with sparkplug --enable the.plugin.name.

Using Spark

The easy way

To get started, use the init command. You still need to edit ./.spark/docker.json and other related files as required, but this is the easy way.

$ spark init
$ sparkplug --enable com.noccy.git

The useful details

Spark expects a configuration file at either ./.spark.json or ./.spark/spark.json, relative to the project root. The ./.spark directory is always used for auxiliary configuration, so the placement is entirely up to you.

On its own it doesn't do much beyond providing a command interface to its internals. The magic is in preloading:

spark.json

{
  "preload": [ "./.spark/plugins/*", "./.spark/autoload.php" ]
}

The preloader iterates over the defined rules and attempts to load each match in one of two ways:

  1. Files with a .php extension are loaded directly.
  2. Directories containing a sparkplug.php file are loaded as plugins.

The advantages of writing your extensions as flat files:

  • Simple interface
  • Quickly register resources for other parts of Spark
  • All code evaluated on load (can be a caveat!)
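
As an illustration, a flat preload file is just plain PHP that runs the moment the preloader includes it. The sketch below registers a simple autoloader; it uses no Spark-specific APIs, and the App\ namespace prefix and lib/ directory are hypothetical, not taken from Spark's documentation:

```php
<?php
// ./.spark/autoload.php -- a flat preload file (hypothetical example).
// Everything at the top level runs immediately on load -- the caveat
// mentioned above -- so keep side effects deliberate.
spl_autoload_register(function (string $class): void {
    $prefix = 'App\\'; // hypothetical project namespace
    if (!str_starts_with($class, $prefix)) {
        return; // not ours; let other autoloaders handle it
    }
    $relative = substr($class, strlen($prefix));
    $path = __DIR__ . '/lib/' . str_replace('\\', '/', $relative) . '.php';
    if (is_file($path)) {
        require $path;
    }
});
```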

The advantage of writing your extensions as plugins:

  • Object-oriented interface
  • Delayed evaluation of code, ensuring dependencies are loaded
  • Free autoloader! The namespace and path of the plugin class are used to set up a PSR-4 autoloader for your code.
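
For orientation, a plugin directory as described above might be laid out like this (the com.example.myplugin name and src/ path are illustrative assumptions, not prescribed by Spark):

```
.spark/plugins/com.example.myplugin/
  sparkplug.php    <- entry point; defines the plugin class
  src/             <- resolved by the PSR-4 autoloader derived from
                      the plugin class's namespace and path
```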

Scripts

Using scripts is the simplest way to leverage Spark:

spark.json

{
  ...
  "scripts": {
    "hello": "./.spark/hello.php",
    "world": "echo 'World'",
    "greet": [
      "@hello",
      "@world"
    ]
  }
}
  • Call other scripts by prepending @ to the script name.
  • .php files are executed in-process, and as such have access to any registered resources, resource types and plugins.
  • .phar files are executed out-of-process, as are any commands that don't match a PHP callable or another specific rule.
  • Substitute shell variables using ${varname}.
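
For example, shell-variable substitution can be combined with script chaining; the script names below are illustrative, not part of Spark itself:

spark.json

```json
{
  "scripts": {
    "greet-user": "echo 'Hello, ${USER}!'",
    "welcome": [
      "@greet-user",
      "echo 'Working in ${PWD}'"
    ]
  }
}
```

Running spark welcome would then invoke greet-user first, with ${USER} and ${PWD} expanded from the environment.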

Resources

Resources are wrappers around database connections and similar services, providing a cleaner interface to their internals.

Resources are generally registered by plugins or local scripts.