Supercharge your development workflow! Spark is an extensible script runner, giving access to database connections, a web API client, and much more.

Spark: Ignite your development workflow

Spark is a utility to help with everything from various mundane tasks to complex database migrations and project deployment.

Installation

System Requirements:

  • PHP 8.0 or later (php-cli)
  • Linux or another POSIX-compatible OS. Probably. May work on macOS!

From dist package

Download the latest dist release from dev.noccylabs.info and extract it into a directory somewhere, such as /tmp:

$ mkdir /tmp/spark; cd /tmp/spark; unzip ~/Downloads/spark-0.1.0-dist.zip

Make sure spark.phar is executable, and run the installer. Then just follow the instructions:

$ chmod +x spark.phar
$ ./spark.phar install

Afterwards you will be able to invoke spark directly, as well as optionally use the sparkplug, sparker, sparkres and sparksh aliases. You can then install new plugins into ~/opt/spark/plugins and enable them in your projects with sparkplug --enable the.plugin.name.

From installer

Download the latest installer release (the one that ends in .run) from dev.noccylabs.info and make it executable:

$ chmod +x spark-0.1.0-dist.run
$ ./spark-0.1.0-dist.run

Follow the instructions, and select Yes when prompted to proceed with the installation.

From source

Download the latest source release from dev.noccylabs.info and extract it into a directory somewhere, such as ~/src/spark. You can then build spark, using Spark itself:

$ unzip -d ~/src/spark spark-0.1.0-src.zip
$ cd ~/src/spark
$ bin/spark run build

You can now install spark.phar wherever you want it, and put the plugins directory somewhere convenient. Then add the following to your .bashrc or similar:

export SPARK_PLUGINS="<path-to-plugins-dir>"
# If you don't want to rename the .phar for some reason. Skip otherwise!
alias spark=spark.phar
# Useful aliases (the quotes are required for multi-word aliases)
alias sparksh="spark repl"
alias sparkplug="spark plugins"
alias sparkpipe="spark pipe"

Using Spark

The easy way

To get started, use the init command. You will still need to edit ./.spark/docker.json and other related files as required, but this is the easy way.

$ spark init
$ sparkplug --enable com.noccy.git

The useful details

Spark expects a configuration file to be found at either ./.spark.json or ./.spark/spark.json relative to the project root. The ./.spark directory will always be used for auxiliary configuration, so the placement is fully up to you.

On its own, Spark doesn't do much beyond providing a command-line interface to its internals. The magic can be found in preloading:

spark.json

{
  "preload": [ "./.spark/plugins/*", "./.spark/autoload.php" ]
}

The preloader goes over each of the defined rules and attempts to load every match in one of two ways:

  1. Files with a .php extension are loaded directly.
  2. Directories containing a sparkplug.php file are loaded as plugins.

The advantages of writing your extensions as flat files:

  • Simple interface
  • Quickly register resources for other parts of Spark
  • All code evaluated on load (can be a caveat!)
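
As an illustration, a flat-file extension could look something like the sketch below. Note that this is a guess at the shape of such a file: the register_resource() helper is a hypothetical name, not a confirmed Spark API, so check the actual registration functions exposed by your Spark version before copying it.

<?php
// Hypothetical ./.spark/autoload.php flat-file extension.
// Everything in this file runs immediately at preload time — that is
// the caveat mentioned above: any side effects happen on every load.
// register_resource() is an illustrative placeholder for whatever
// resource-registration call Spark actually provides.
register_resource("db", fn () => new PDO("sqlite::memory:"));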

The advantages of writing your extensions as plugins:

  • Object-oriented interface
  • Delayed evaluation of code, ensuring dependencies are loaded
  • Free autoloader! The namespace and path of the plugin class are used to set up a PSR-4 autoloader for your code.
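
A plugin could be sketched roughly as follows. The class shape and hook name here are assumptions for illustration (the real base class and lifecycle methods are defined by Spark itself); the namespace/path pair is what feeds the PSR-4 autoloader mentioned above.

<?php
// Hypothetical ./.spark/plugins/my-plugin/sparkplug.php.
// Unlike a flat file, none of this runs at preload time — the plugin
// is instantiated later, once its dependencies are available.
namespace Acme\MyPlugin;

class MyPlugin // extends Spark's plugin base class (name not shown here)
{
    public function load(): void
    {
        // Called with dependencies loaded; register resources,
        // commands, etc. here rather than at file scope.
    }
}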

Scripts

Using scripts is the simplest way to leverage Spark:

spark.json

{
  ...
  "scripts": {
    "hello": "./.spark/hello.php",
    "world": "echo 'World'",
    "greet": [
      "@hello",
      "@world"
    ]
  }
}
  • Call on other scripts by prepending @ to the script name.
  • .php files are executed in-process, and as such have access to any registered resources, resource types and plugins.
  • .phar files are still executed out-of-process, as are any commands that don't match a PHP callable or any other specific rule.
  • Substitute shell variables using ${varname}.
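
Tying the spark.json example above together: the "hello" script points at a PHP file, which could be as simple as the sketch below. Because .php files run in-process, anything registered by plugins or flat files is reachable from here.

<?php
// ./.spark/hello.php — an in-process script, invoked via the "hello"
// entry in spark.json (and in turn by the "greet" script via "@hello").
echo "Hello\n";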

Resources

Resources are wrappers around database connections and the like, providing a cleaner interface to the underlying connection.

Resources are generally registered by plugins or local scripts.