# Spark: Ignite your development workflow
Spark is a utility to help with everything from various mundane tasks to complex database migrations and project deployment.
## Installation
Download `spark.phar` and make it executable. If desired, alias `spark=spark.phar`. You may also want to alias `sparksh='spark repl'`.
Download the latest release from dev.noccylabs.info and extract it into a directory somewhere, such as `/tmp`:
```shell
$ mkdir /tmp/spark; cd /tmp/spark; unzip ~/Downloads/spark-0.1.0-dist.zip
```
Make sure spark.phar is executable, and run the installer. Then just follow the instructions:
```shell
$ chmod +x spark.phar
$ ./spark.phar install
```
Afterward you will be able to call `spark` directly, as well as optionally the `sparkplug`, `sparker`, `sparkres` and `sparksh` aliases. You can then install any new plugins into `~/opt/spark/plugins` and enable them in your projects with `sparkplug --enable the.plugin.name`.
## Using Spark
### The easy way
To get started, use the `init` command. You still need to edit `./.spark/docker.json` and other related files as required, but this is the easy way.
```shell
$ spark init
$ sparkplug --enable com.noccy.git
```
### The useful details
Spark expects a configuration file at either `./.spark.json` or `./.spark/spark.json`, relative to the project root. The `./.spark` directory will always be used for auxiliary configuration, so the placement is entirely up to you.
On its own, the configuration file doesn't do much except provide a command interface to its contents. The magic can be found in preloading:
`spark.json`:

```json
{
    "preload": [ "./.spark/plugins/*", "./.spark/autoload.php" ]
}
```
The preloader goes over each of the defined rules and attempts to load each match in one of two ways, as applicable:

- Files with a `.php` extension will be loaded directly.
- Directories containing a `sparkplug.php` file will be loaded as plugins.
The advantages of writing your extensions as flat files:
- Simple interface
- Quickly register resources for other parts of Spark
- All code evaluated on load (can be a caveat!)
The advantage of writing your extensions as plugins:
- Object-oriented interface
- Delayed evaluation of code, ensuring dependencies are loaded
- Free autoloader! The namespace and path of the plugin class will be used to set up a PSR-4 autoloader for your code.
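To make the two preload styles concrete, a project combining them might be laid out like this; the file and plugin names under `.spark/` are hypothetical examples matching the preload rules above:

```
project/
├── .spark/
│   ├── spark.json                 # or ./.spark.json in the project root
│   ├── autoload.php               # flat file: loaded directly, evaluated on load
│   └── plugins/
│       └── com.example.hello/     # hypothetical plugin directory
│           └── sparkplug.php      # marks the directory as a plugin
└── src/
```

With the `"./.spark/plugins/*"` glob from the example configuration, any plugin directory dropped under `.spark/plugins/` is picked up automatically.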
## Scripts
Using scripts is the simplest way to leverage Spark:
`spark.json`:

```json
{
    ...
    "scripts": {
        "hello": "./.spark/hello.php",
        "world": "echo 'World'",
        "greet": [
            "@hello",
            "@world"
        ]
    }
}
```
- Call other scripts by prepending `@` to the script name.
- `.php` files are executed in-process, and as such have access to any registered resources, resource types and plugins.
- `.phar` files are still executed out-of-process, as are any commands that don't match a PHP callable or any other specific rule.
- Substitute shell variables using `${varname}`.
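These features can be combined in a single entry. As a hedged sketch, the `deploy` script and `TARGET` variable below are hypothetical names, not part of Spark itself:

```json
{
    "scripts": {
        "deploy": [
            "@greet",
            "echo 'Deploying to ${TARGET}'"
        ]
    }
}
```

Here `@greet` chains the script defined earlier, while `${TARGET}` would be substituted from the shell environment before the command runs.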
## Resources
Resources are wrappers around database connections and the like, providing a cleaner interface to their internals. Resources are generally registered by plugins or local scripts.