# Spark: Ignite your development workflow

Spark is a utility that helps with everything from mundane everyday tasks to complex database migrations and project deployment.

## Installation

Download `spark.phar` and make it executable. If desired, add an alias `spark=spark.phar`. You may also want to alias `sparksh='spark repl'`.

## Using Spark

Spark expects to find a configuration file at either `./.spark.json` or `./.spark/spark.json`, relative to the project root. The `./.spark` directory is always used for auxiliary configuration, so the placement of the main file is entirely up to you.

On its own, Spark doesn't do much beyond providing a command interface to its internals. The magic lies in preloading:

*spark.json*
```
{
    "preload": [
        "./.spark/plugins/*",
        "./.spark/autoload.php"
    ]
}
```

The preloader goes over each of the defined rules and attempts to load the matches in one of two ways, where applicable:

1. Files with a `.php` extension are loaded directly.
2. Directories containing a `sparkplug.php` file are loaded as plugins.

The advantages of writing your extensions as flat files:

- Simple interface
- Quickly register resources for other parts of Spark
- All code evaluated on load (can be a caveat!)

The advantages of writing your extensions as plugins:

- Object-oriented interface
- Delayed evaluation of code, ensuring dependencies are loaded

### Scripts

Using scripts is the simplest way to leverage Spark:

*spark.json*
```
{
    ...
    "scripts": {
        "hello": "./.spark/hello.php",
        "world": "echo 'World'",
        "greet": [ "@hello", "@world" ]
    }
}
```

`.php` files are executed in-process and as such have access to any registered resources, resource types, and plugins.

### Resources

Resources are wrappers around database connections and the like, providing a cleaner interface to their internals.
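For reference, putting the two fragments above together yields a complete `spark.json`. The `greet` entry appears to chain the other two scripts by `@`-reference, while `world` is run as a shell command:

*spark.json*
```
{
    "preload": [
        "./.spark/plugins/*",
        "./.spark/autoload.php"
    ],
    "scripts": {
        "hello": "./.spark/hello.php",
        "world": "echo 'World'",
        "greet": [ "@hello", "@world" ]
    }
}
```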
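The `hello` script points at `./.spark/hello.php`, which is not shown in this README. A minimal sketch of such an in-process script, assuming it does nothing more than print a greeting, could be:

*.spark/hello.php*
```
<?php

// Runs inside the Spark process, so anything registered by preloaded
// files or plugins is available here. This sketch only prints a greeting.
echo 'Hello' . PHP_EOL;
```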