Spark: Ignite your development workflow
Spark is a utility to help with everything from various mundane tasks to complex database migrations and project deployment.
Installation
System Requirements:
- PHP 8.0 or later (php-cli)
- Linux or another POSIX-compatible OS. Probably. May work on macOS!
From dist package
Download the latest dist release from dev.noccylabs.info
and extract it into a directory somewhere, such as /tmp:
$ mkdir /tmp/spark; cd /tmp/spark; unzip ~/Downloads/spark-0.1.0-dist.zip
Make sure spark.phar is executable, and run the installer. Then just follow the instructions:
$ chmod +x spark.phar
$ ./spark.phar install
Afterward you will be able to invoke spark directly, as well as optionally
through the sparkplug, sparker, sparkres and sparksh aliases. You can then
install any new plugins into ~/opt/spark/plugins and enable them in your
projects with sparkplug --enable the.plugin.name.
From installer
Download the latest installer release (the one that ends in .run)
from dev.noccylabs.info
and make it executable:
$ chmod +x spark-0.1.0-dist.run
$ ./spark-0.1.0-dist.run
Follow the instructions and select Yes when prompted to proceed with the installation.
From source
Download the latest source release from dev.noccylabs.info
and extract it into a directory somewhere, such as ~/src/spark. You can then build spark using spark itself:
$ unzip -d ~/src/spark spark-0.1.0-src.zip
$ cd ~/src/spark
$ bin/spark run build
You can now install spark.phar wherever you like, and place the plugins directory
somewhere suitable. Then add the following to your .bashrc or similar:
export SPARK_PLUGINS="<path-to-plugins-dir>"
# If you don't want to rename the .phar for some reason. Skip otherwise!
alias spark=spark.phar
# Useful aliases
alias sparksh="spark repl"
alias sparkplug="spark plugins"
alias sparkpipe="spark pipe"
Using Spark
The easy way
To get started, use the init command. You will still need to edit ./.spark/docker.json
and other related files as required, but this is the easy way:
$ spark init
$ sparkplug --enable com.noccy.git
The useful details
Spark expects a configuration file to either be found at ./.spark.json or
./.spark/spark.json relative to the project root. The ./.spark directory
will always be used for auxiliary configuration, so the placement is fully up
to you.
On its own it doesn't do much beyond providing a command-line interface to its internals. The magic is in preloading:
spark.json
{
"preload": [ "./.spark/plugins/*", "./.spark/autoload.php" ]
}
The preloader will go over each of the defined rules and attempt to load them in one of two ways, if applicable:
- Files with a .php extension will be loaded directly.
- Directories containing a sparkplug.php file will be loaded as plugins.
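As a rough sketch (this is an illustration of the two rules, not Spark's actual loader code, and the demo/ layout is made up):

```shell
# Build a throwaway layout matching the preload rules above.
mkdir -p demo/.spark/plugins/com.example.demo
touch demo/.spark/plugins/com.example.demo/sparkplug.php
touch demo/.spark/autoload.php

loaded=""
# Rules as they might appear in spark.json:
# "./.spark/plugins/*" and "./.spark/autoload.php"
for entry in demo/.spark/plugins/* demo/.spark/autoload.php; do
    case "$entry" in
        # .php files are loaded directly
        *.php) [ -f "$entry" ] && loaded="$loaded file:$entry" ;;
        # directories with a sparkplug.php are loaded as plugins
        *)     [ -f "$entry/sparkplug.php" ] && loaded="$loaded plugin:$entry" ;;
    esac
done
echo "$loaded"
rm -rf demo
```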
The advantages of writing your extensions as flat files:
- Simple interface
- Quickly register resources for other parts of Spark
- All code evaluated on load (can be a caveat!)
The advantage of writing your extensions as plugins:
- Object-oriented interface
- Delayed evaluation of code, ensuring dependencies are loaded
- Free autoloader! The namespace and path of the plugin class will be used to set up a Psr-4 autoloader for your code.
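A plugin directory loaded this way might be laid out as follows. Only the sparkplug.php file is mandated by the preload rule above; the plugin name and the extra class file are made-up examples:

```
.spark/plugins/com.example.demo/
    sparkplug.php     <- required; its presence makes the directory load as a plugin
    SomeHelper.php    <- hypothetical extra class, reachable through the Psr-4 autoloader
```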
Scripts
Using scripts is the simplest way to leverage Spark:
spark.json
{
...
"scripts": {
"hello": "./.spark/hello.php",
"world": "echo 'World'",
"greet": [
"@hello",
"@world"
]
}
}
- Call on other scripts by prepending @ to the script name.
- .php files are executed in-process, and as such have access to any registered resources, resource types and plugins.
- .phar files are still executed out-of-process, as are any commands that don't match a PHP callable or any other specific rule.
- Substitute shell variables using ${varname}.
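As a made-up illustration of the substitution rule, a script entry can interpolate environment variables (the greet-user name is invented for this example):

spark.json

```json
{
    "scripts": {
        "greet-user": "echo 'Hello, ${USER}'"
    }
}
```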
Resources
Resources are wrappers around database connections and the like, providing a cleaner interface to their innards.
Resources are generally registered by plugins or local scripts.