This is the Julia track, one of the many tracks on exercism.io.
It holds all the exercises that are currently implemented and available for students to complete.
The track consists of various core exercises, the ones a student must complete, and each core exercise may unlock various side exercises.
You can find this in the `config.json`.
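For illustration, here is a heavily trimmed sketch of one exercise entry in `config.json`. The field names follow the general Exercism track configuration format, and all the values shown here are made up:

```json
{
  "exercises": [
    {
      "slug": "hello-world",
      "uuid": "00000000-0000-0000-0000-000000000000",
      "core": true,
      "unlocked_by": null,
      "difficulty": 1,
      "topics": ["strings"]
    }
  ]
}
```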
It's not uncommon for people to discover incorrect implementations of certain tests, suggest a track-specific hint to aid the student with the Julia specifics, report missing edge cases, factual errors or logical errors, or implement and develop new exercises.
We welcome contributions of all sorts and sizes, from reporting issues to submitting patches, as well as joining the current discussions 💬.
This guide covers several common scenarios pertaining to improving the Julia track. There are similar guides about contributing to other parts of the Exercism ecosystem.
There are track-independent guides available on how to use git and GitHub:
Help us keep Exercism welcoming. Please read and abide by the Code of Conduct.
Before contributing code to any existing exercise or any new exercise, please have a thorough look at the current exercises and dive into open issues.
There are two ways to implement new exercises.
- Pick one from the list of exercises.
- Create a new, track-specific exercise from scratch.
Let's say you want to implement a new exercise from the list of exercises, because you've noticed that this track could benefit from it, really liked it in another track, or simply find it interesting. The first step is to check for an open issue. If there is one, make sure no one is working on it, and above all that there is no open Pull Request for this exercise.
If there is no such issue, you may open one. The baseline of work is as follows:
- Open a new issue; we'll label it with `new exercise ✨`.
- We'll assign the issue to you, so you get to work on this exercise.
- Create a new folder `exercises/$slug`, where `$slug` refers to the exercise slug, the machine-readable name that is listed in the list of exercises.
- Create a `$slug.jl` stub file (a sketch of these files follows this list).
- Create a `runtests.jl` test file. Here, add the tests, per canonical data if possible.
- Create an `example.jl` file. Place a working implementation in it, assuming it's renamed to `$slug.jl`.
- The example solution is meant to show that the test suite works properly. It doesn't have to be an ideal or optimised solution!
- Run the tests locally, by running `julia runtests.jl $slug` from the root of the repository.
- Add the exercise to `config.json`. You can generate the entry interactively by running `julia bin/new-exercise.jl` and answering the prompts. If you're unsure about the difficulty, progression or topics, just guess and we will discuss it in the Pull Request.
  - You may have to install `configlet` first, if you haven't already.
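For a rough idea of what the stub and example files contain, here is a sketch for a hypothetical exercise with the slug `hello-world`; the function name and implementation below are made up:

```julia
# hello-world.jl — the stub the student starts from; it may also be
# empty, depending on the exercise
function hello()
end
```

```julia
# example.jl — a working implementation; the test suite runs against it
# as if it were renamed to hello-world.jl
hello() = "Hello, World!"
```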
Instead of manually creating a `runtests.jl` file, you may also add a generator script that takes the `canonical-data.json` for the exercise as input and generates a test suite from it.
Eventually we will create a framework for this, but for now, individual scripts are fine.
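Such a script might look roughly like the sketch below. It is only an outline under simplifying assumptions: it requires the JSON.jl package, uses a made-up slug and function name, and assumes every canonical case is flat with a single-value `input`; real canonical data often nests cases in groups, so a real script needs to handle more shapes.

```julia
# Sketch of a test generator: canonical-data.json in, runtests.jl out.
using JSON  # assumes the JSON.jl package is installed

data = JSON.parsefile("canonical-data.json")
slug = "hello-world"  # hypothetical slug
fn   = "hello"        # hypothetical function under test

open("runtests.jl", "w") do io
    println(io, "# canonical data version: $(data["version"])")
    println(io, "using Test\n")
    println(io, "include(\"$slug.jl\")\n")
    for case in data["cases"]  # assumes a flat list of cases
        println(io, "@testset $(repr(case["description"])) begin")
        println(io, "    @test $fn($(repr(case["input"]))) == $(repr(case["expected"]))")
        println(io, "end\n")
    end
end
```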
The final step is opening a Pull Request, with these items all checked off. Make sure the tests pass and the config linter doesn't complain. They will run automatically on your PR.
The steps for a track-specific exercise are similar to those of implementing an established, existing exercise. The differences are:
- You'll have to write a README.md and test suite from scratch.
- You'll have to come up with a unique slug.
- You may need to provide an icon for it. (optional)
- Generate a UUID, for example using `configlet` (a pure-Julia alternative follows this list).
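If you'd rather not use `configlet` for this one step, Julia's standard library can generate a UUID as well:

```julia
# Generate a random version 4 UUID with the built-in UUIDs stdlib,
# available since Julia 1.0.
using UUIDs

println(uuid4())  # prints e.g. 9d73b392-0a25-4932-8a4f-8e1c43c6a2f4
```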
Open a new issue with your proposal, and we'll make sure all these steps are correctly taken. Don't worry! You're not alone in this.
After following the steps above, your exercise should contain at least the following files:
- `README.md` - The problem description and other information that is presented to the student. Generated using `configlet`.
- `example.jl` - Contains an example solution that proves the test suite works.
- `$slug.ipynb` - The Jupyter notebook version of the exercise. Generated using `bin/generate-notebooks.jl`.
- `$slug.jl` - The file that the student will write their solution in. May contain stubs or be empty, depending on the exercise.
- `runtests.jl` - The test suite for the exercise. Contains a standardized comment referring to the canonical data version, and the tests. It must only `include` `$slug.jl`, never `example.jl` (see the sketch after this list).
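A minimal sketch of such a `runtests.jl`, again for a hypothetical `hello-world` exercise. The exact version-comment format is an assumption here, so mirror what the existing exercises use:

```julia
# canonical data version: 1.1.0
using Test

# Only include the student's file, never example.jl; the example is
# treated as if renamed to hello-world.jl when the suite runs against it.
include("hello-world.jl")

@testset "Hello World" begin
    @test hello() == "Hello, World!"
end
```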
Further, an entry in `config.json` was added for the exercise.
This is the bare minimum; the exercise may contain further files, e.g. to add additional information to the README.
Take a look at the `exercises/` directory or commit history for examples, or at this example of what a commit adding a new exercise should look like.
There are always improvements possible on existing exercises.
`README.md`: the description that shows up on the student's exercise page when they are ready to start.
It's also downloaded as part of the exercise's data.
The `README.md`, together with the `runtests.jl` file, forms the contract for the implementation of the exercise.
No test should force a specific implementation, and no `README.md` explanation should give away a certain implementation.
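To illustrate, a behaviour-only test keeps that contract intact. This is a hypothetical sketch; the exercise and function names are made up:

```julia
using Test

include("leap.jl")  # hypothetical exercise file

@testset "Leap years" begin
    # Good: asserts only the observable result.
    @test is_leap_year(1996)
    @test !is_leap_year(1900)
end

# Avoid tests like the following, which inspect the source and thereby
# force one particular implementation on the student:
# @test occursin("% 4", read("leap.jl", String))
```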
The `README.md` files are generated, as explained here.
- This file may need to be regenerated in order to sync with the latest canonical data.
- You may contribute track-specific `hints.md` files, as listed in that document.
- You may improve the track-specific `exercise-readme-insert.md`, and regenerate all the READMEs.
Syncing an exercise with canonical data: There is a problem-specifications repository that holds test data in a standardised format. These tests are occasionally fixed, improved, added, removed or otherwise changed. Each change also changes the version of that canonical data. Syncing an exercise consists of:
- updating or adding the `version` comment on top of the `runtests.jl` file (illustrated below),
- updating the `runtests.jl` file,
- matching the `example.jl` file to still work with the new tests, and
- regenerating the `README.md`, should there be any changes.
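Concretely, the first of those steps usually amounts to a one-line change at the top of `runtests.jl`. The comment format shown here is an assumption; copy whatever the track's existing exercises use:

```julia
# Before the sync, the first line of runtests.jl reads e.g.:
# canonical data version: 1.1.0

# After syncing with the updated problem-specifications data:
# canonical data version: 1.2.0
```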
Mentor notes are notes given to mentors to guide them with mentoring.
These notes do not live in this repository, but in the `website-copy` repository.
Take a look at their contributing guidelines.
There is quite a bit of student-facing documentation, which can be found in the `docs` folder.
You may improve these files by making the required changes and opening a new Pull Request.
You'll need Julia 1.0 or higher in order to contribute to the code in this repository.
If you'd like to download configlet, you can use the `fetch-configlet` scripts.
They run on Linux, macOS and Windows, and download `configlet` to your local drive.
Find more information about configlet here.
We have various scripts to aid you with maintaining and contributing to this repository.
You can find them in the `bin/` directory.