M2 Project Source Code Management Recommendation

In this post I describe the proposed Magento 2 recommendation for managing the source code of a Magento 2 project. Proposed, because this is my unofficial blog; recommendation, because there is no requirement to follow this approach. But I have some more blogs coming shortly that assume this approach.

tl;dr: We ship a ‘.gitignore’ file with Magento that ignores ‘vendor’ and various other miscellaneous files used by popular IDEs. Use this and git commit everything else Magento installs to your project directory. Then use git tools to merge any local customizations with changes introduced by Magento patches or upgrades.

This blog assumes you have a development environment (e.g. on your laptop) where you make changes and then commit code to a git repository. Other team members can then check out that code and make further modifications, as per normal with git. The production site is built from the code in the git repository.

The .gitignore File

When you create a new project, Magento creates a ‘.gitignore’ file. We may tune this file over time, but it is set up based on what we recommend committing to a project source code directory. For example, it ignores the ‘vendor’ directory. Local customizations specific to that project go under ‘app/code’ (modules) and ‘app/design’ (themes); code downloaded via Composer (and hence not “owned” by that project) does not belong with the rest of the project source code.

Put more simply, source code should be managed in a source code management system which represents the “source of truth” of that code. Committing the ‘vendor’ directory is “wrong” because the source of truth is not the current project.

You can adjust the provided ‘.gitignore’ file to add extra directories. For example, ‘/node_modules’ will hold NodeJS downloaded libraries if you use Grunt, so frontend developers may wish to add this directory to the project’s ‘.gitignore’ file.
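
As an illustration, appending such entries might look like the following (the ‘/.idea’ entry is just an example IDE directory, not something every project needs):

    # Append project-specific entries to the generated .gitignore and commit it.
    echo '/node_modules' >> .gitignore
    echo '/.idea' >> .gitignore        # example IDE directory; adjust to your tools
    git add .gitignore
    git commit -m "Ignore NodeJS and IDE artifacts"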

From Dev to Production

There are multiple possible paths to get code from a local development box to production.

  1. Run the compilation/static view file deployment phases on your development box, then copy the results to your production server (e.g. using ftp or scp).
  2. Commit local source file changes to your git repository, and have another process that runs automated tests and on success builds a production image. The production image is then deployed to production.
  3. Same as the previous step, except the production compilation phase is run on your production box.

Running the compilation phase on the production box is the least effort, but assumes you can put the store into maintenance mode during the compilation. (This is an area for future investigation to improve, as the compilation phase is slower than it should be.) For mid to large sites where tolerance for outages during deployment is reduced or non-existent (global brands have no off-peak time), you would most likely go with option 2 above, where compilation does not happen on the production server.

Local Compilation Process

If performing compilation on your development box, perform the following steps (a rough sketch follows the list):

  1. Run ‘magento deploy:mode:set production’.
  2. Zip/tar/etc the entire directory to take to your production server. (There are some gotchas here in that the env.php file contains environment-specific information, such as database connection details, which may differ between dev and prod environments.)
  3. Then set your development box back to developer mode so you can continue development.
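
As a rough sketch of those steps (the archive name is arbitrary, and this assumes the Magento CLI is on your PATH):

    # Switch to production mode (triggers DI compilation and static view file deployment).
    magento deploy:mode:set production
    # Archive the tree to copy to the production server; remember app/etc/env.php
    # contains dev-specific settings (e.g. database credentials) that may need adjusting for prod.
    tar --exclude='.git' -czf ../m2-release.tar.gz .
    # Return the development box to developer mode.
    magento deploy:mode:set developer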

Triggered Build Process

If using git commit to trigger a CI/CD run of automated tests or other automated deployment, followed by a build process (a rough script sketch follows this list):

  1. Use a GitHub or similar web hook to trigger the running of tests when a git push is made to the git repository.
  2. git clone the source code repository to an empty, clean, directory.
  3. Edit the ‘composer.json’ file to add “magento-deploystrategy”: “none” to the “extra” section. This stops the following step from overriding any local changes to index.php or similar files from the Magento “base” package. For example you can use:
     sed -i.bak 's/"extra": {/"extra": {"magento-deploystrategy": "none",/' composer.json
  4. Run ‘composer install’ to download all standard modules.
  5. Run ‘magento deploy:mode:set production’. This triggers static view asset deployment and dependency injection compilation automatically.
  6. Run automated tests on the source code.
  7. The final tree is ready to be copied to production.
  8. Delete the directory if you want to, so you are guaranteed to start clean next time.
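
Pulling those steps together, a build script might look roughly like this (the repository URL, archive name, and test command are placeholders, not Magento-provided tooling):

    #!/bin/sh
    set -e
    git clone git@example.com:mystore/project.git build    # placeholder repository URL
    cd build
    # Stop the Magento installer overriding locally committed copies of index.php
    # and similar files from the "base" package (step 3 above).
    sed -i.bak 's/"extra": {/"extra": {"magento-deploystrategy": "none",/' composer.json
    composer install
    magento deploy:mode:set production    # DI compilation + static view file deployment
    ./run-tests.sh                        # placeholder for your automated test suite
    tar --exclude='.git' -czf ../release.tar.gz .    # final tree, ready to copy to production
    cd .. && rm -rf build                 # guarantees a clean start next time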

Why was the “magento-deploystrategy” clause required? Because, by default, if Magento releases a new version of the “magento-base” package, Composer plugins will run the Magento installer to replace whatever files exist on disk with the new versions from the Magento distribution.

If you have added the files to git, then installing any patches to the “magento-base” package will show up in a “git diff” (for those files copied from the “magento-base” package to outside of the ‘vendor’ directory). Such modified files would be reported as a change to be committed, allowing a developer to easily review the changes and decide what to keep or discard.

Installing a Patch

Installing a patch works by the developer updating the ‘composer.json’ file to reflect the new patch level to download, running ‘composer update’ to get the latest version of files, then using ‘git diff’ and similar tools to work out which local changes may have been overridden by applying the patch.

For example, if you have made a local change to ‘index.php’, doing an upgrade will write the default Magento version of ‘index.php’ over the top of your local file. So after installing a patch, do a ‘git diff’ to work out what has changed and reapply any local customizations you have made to files such as ‘index.php’ before committing any changes.
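
A minimal sketch of that flow, after bumping the version in ‘composer.json’ (the commit message and file name are just illustrations):

    composer update            # pulls down the new patch level
    git status                 # lists skeleton files the patch overwrote
    git diff index.php         # review the patch's version against your committed local copy
    # ...re-apply any local customizations you still want, then:
    git add -A
    git commit -m "Apply Magento patch; re-merge local index.php changes"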

Note: Only the ‘base’ package can contain conflicts, and they will only occur in the small number of files you have locally modified. The ‘git diff’ or similar command will report any such local changes, so they will be easy to spot.

Over time, as future Magento patches and releases migrate files from outside the vendor directory to inside (a strategic goal), you will notice files being deleted from your git repository. Review such changes, git commit the deletions of files you have not locally modified, and do not commit file deletions where you have local changes you want to keep.
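
For example, reviewing such deletions might look like this (the file names are placeholders):

    git status                             # deleted skeleton files show as unstaged deletions
    git add path/to/unmodified-file.php    # stage the deletion of a file you never changed (placeholder path)
    git checkout -- index.php              # restore a deleted file whose local changes you want to keep
    git commit -m "Accept file removals from the Magento release"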

The end result is there will be no further conflicts once the migration of the remaining ‘base’ package files to ‘vendor’ is complete. The above will work until we get there.

Which files should I locally modify?

There are some files we anticipate being locally modified, and some files that we strongly recommend you do not modify locally.

Examples of files anticipated to be modified more frequently:

  • composer.json – this file will need updating to the latest patch version.
  • index.php – it is common to have minor modifications to this file. The file is short, so merging any conflicts should be quick.
  • .htaccess – there may be desired local changes to ‘.htaccess’ files.
  • dev/tests – if you want to add additional non-unit-test tests, this is where such changes would be made.

Examples of files recommended against modification:

  • etc/di.xml – do not modify this file locally. All entries in this file can be overridden by creating a local module.

Locally developed modules and themes will reside under ‘app/code’ and ‘app/design’, and so be managed by the git repository for the project.

Production, Developer, and Default Modes

A final note: I personally do not recommend using “default” mode, where you apply changes directly to your production instance. It is officially supported, but I prefer keeping my source code under version control. For example, if you want to try out a new extension, add it to a “development” environment, try it out on your laptop, and if it is not acceptable you can just throw the source code changes away. You have not touched your production site. It also means you can lock down the production site in read-only mode more tightly.

PART II

The above said what to do, without much rationale. The following explains the why.

Some History

First, a bit of history. Magento 1 had a particular directory structure where files of modules were not all located under one directory – different files for a module were located in different places. Magento 2 has improved modularity over Magento 1, including adopting Composer as its standard for installation of modules. This includes all files of a module (or theme) residing under a single directory under ‘vendor’, as recommended by Composer.

It is not uncommon practice in Magento 1 to modify core files, so effectively all source code is a part of the project. Magento 2 is stamping out this practice, so that

  1. all files local to a project sit outside ‘vendor’,
  2. all files local to a third party extension sit within the package(s) of that extension, and
  3. all core files sit within the packages of the core platform.

Magento 2 cleaned up many of the areas here (e.g. modules and themes are all packages now), but there was too much to complete in the Magento framework in time for the M2 release, so it was decided to finish this work over subsequent releases. But the final target is getting all files provided and patched by Magento under the ‘vendor’ directory.

(If you use Grunt (or Gulp) for front end development, you need to also install NodeJS and supporting libraries. These libraries are downloaded and stored in a ‘node_modules’ directory. As such, you may wish to add the ‘node_modules’ directory to the ‘.gitignore’ file as well.)

Why is this Important?

One significant problem with the M1 practice of editing core files is it forces patching to be a manual process – a human has to review all local modifications that have been made to core files and re-merge them with patched or new releases of files provided by Magento. This makes Magento 1 more expensive for merchants to patch.

M2 Test Automation

One area of improvement that has been completed in M2 is that unit tests have been moved into module directories – any module can declare its own unit tests which will be run when the module is loaded. However, functional tests (using Selenium) have not yet been restructured in this way. This is planned in a future release. That is, the plan is for any extension to be able to contribute additional functional tests to be automatically run with the overall set of functional tests.

This is a bit tricky however as functional tests depend on which modules have been loaded – so a functional test either needs to be flexible enough to work with any combination of modules loaded, or the test may only work with a specific set of modules loaded. Care also needs to be taken so that the functional test does not add additional unwanted dependencies for the module. It’s a bit tricky to get right, so we are not rushing the process but welcome any external great minds to think about the best way forward here. The short term solution is if a project wishes to customize functional tests, it can do so by locally modifying the Magento provided tests.

Is testing such an important issue? Yes! Magento 1 had no automated tests. Magento 2 has made a significant investment in test automation for a number of reasons, including the drive towards simplifying the installation of patches and performing upgrades. The more test automation, the lower the cost to merchants in applying patches. In Magento 1, patching was expensive for merchants due to the manual labor. In Magento 2, through test automation, installing a patch requires greatly reduced manual labor, as running the automated tests will cover most of the important business flows for you.

The goal is to make test automation the norm, not something only high end sites do. To achieve this, running tests needs to be simple even for non-technical merchants. Merchants need an easy flow to install an extension/patch, run the tests, and then deploy the result to production. M2 has the bones of a solution, but more work is required to make it “easy”.

Simplified Patching

Magento 2, due to code restructuring and the adoption of Composer, makes it easier to install patches. Easy patch installation in turn improves site security. If a security patch is released, merchants can adopt it sooner and with greater confidence with Magento 2.

This implies that extension developers should be encouraged to also develop automated tests for their extensions, so upgrading those extensions will also be easier and cheaper for merchants.

Great Plans, What is Already Done?

Is all this effort worth it in the short term then? Has M2 really improved things already? Yes. M2 is significantly better than M1 already because the only area where local changes may conflict with official Magento code is the Magento “base” package. No local changes need to be made to core modules – M2 provides numerous tools to override core functionality in a more controlled way. This means the manual effort to review patch changes is already drastically reduced in M2. This benefit may not be felt yet, as Magento has not released many patches so far. But M2 has been designed to simplify the patch process.

Conclusions

This was a long post with lots of rambling background information. The key points however are simple.

  • If you use ‘composer create-project’ I recommend you git commit the resultant directory tree. A decent ‘.gitignore’ file will be provided for you. That is how to manage the source code.
  • With this approach, you may need to do some manual merging of “base” package files (not modules) until we eliminate the last remaining files that need such copying. Git makes this straightforward to do.
  • You do need to pick the approach you want to follow going from development to production.
  • I have at least one upcoming blog post on simplified development environments compatible with this approach to project source code management.

There are other options for source code management. The approach proposed here is close to the end game. Local files outside of ‘vendor’ will be under git source code management. There will be a little bit of chunkiness for sites changing files like ‘index.php’ until the migration is complete, but it’s not that bad in practice. The practices above line up with the end goal after the cleanup.

21 comments

  1. Hi Alan, thank you for clarifying some of the process around source control.

    There is one area that wasn’t touched on, which is how to manage a patch delivered as a .patch file and applied using the patch command. If a patch is applied to core files in the vendor/magento directory, those patched files are ignored based on the .gitignore file and never committed back to the codebase. Committing the vendor directory is a potential resolution, however it is generally viewed as not best practice.

    In addition, a new patch release may not include the changes previously applied through a patch file (as we have seen in 2.0.1 – 2.0.3). This appears to potentially lead to divergent code within modules, potentially having to reapply the patch to the new release source. If many patches have been applied to a codebase, this process of reconciling the new release with any applied patches will become difficult to support over time.

    Are there any recommendations on how to manage this scenario?

    1. We are moving towards only shipping patches via Composer. But you are free to add your own extra entries to .gitignore files yourself.

    2. Sorry, re-reading, we did not release .patch files for 2.0.1 to 2.0.3 – you mentioned some files “were missing from the .patch file”. Could you expand? I am not sure what this refers to.

      1. twolaver:

        I’m referring to patch files delivered in response to a client support request and patch files under the vendor/magento directory.

        We have seen patches delivered for 2.0.0 which are not included in any of the patch releases (2.0.1 – 2.0.3)

      2. Let me chase up internally with the client support team. The long term goal is to stop doing that – but we don’t have a release process fast enough yet. They should be included in official patches.

  2. Interesting read but I can’t agree completely.

    Git is not really a code management system; it’s a version control system. The GitHub UI (revisions, diffing, issues and so on) can be a code management system, but the git behind it is still a version control system.

    You don’t manage the “source of truth” with a VCS, you manage “change”. If you deploy something you are deploying the “change”, not a “source of truth”. Viewed from the project context, the M2 core or any of the stuff hidden away in the vendor folder is not a “source of truth” but merely a dependency: a certain version of “change” (a clean installation, for example), and after that you’ll add your own versions of “change” beside it (by not modifying the dependencies or core). You should validate all dependencies beforehand to check whether they really come from the real “source of truth”, but after that they become a “change” or dependency in your project context.

    Adding .gitignore as a “core” file inside the M2 repo is wrong for several reasons:

    * this is a configuration file that can be environment/project specific, and developers need to manage it on their own without conflicting with origin
    * the current .gitignore is a config file specific to the Magento 2 team’s environment or their development flow
    * you try to decide what is important on behalf of the end user, or define what their deployment flow is
    * as you pointed out in the “Installing a Patch” section, a composer upgrade will rewrite index.php etc. or any core file (including .gitignore).

    This means that all those files that come with the installation are “core” files; you are never supposed to change those directly (this now unfortunately includes .gitignore, and working around that is painful). If you do, then you can expect conflicts and manual diffing on each update (as you already stated) and in each env you are about to update. If you don’t edit any core files directly you can upgrade and patch the core code left and right without any conflicts or even little diffing. M1 (even with its different folder structure) can be managed the same way: a clean core and no conflicts on patching and flawless upgrading.

    Quote: “Put more simply, source code should be managed in a source code management system which represents the “source of truth” of that code. Committing the ‘vendor’ directory is “wrong” because the source of truth is not the current project”

    In your project context (as a developer or code owner) you are not after managing the “source of truth” of a single dependency; you are managing a “change” inside your project. To make this possible you need to control and validate everything that a single version of “change” includes. You can do this by versioning all dependencies separately as well, but in the end it really does not matter how a piece of code ends up in your project (manually added, via composer, at build time, even by getting hacked) – it is still a change.

    Even if you think that validating “vendor” should be managed within a different flow (as it is hidden by default), it is still an important part that every code owner needs to handle (and have a vote of their own on how exactly – to ignore or not). If .gitignore ignores “vendor” then it could also ignore any other Magento core file that you think devs should not edit and should take as a “source of truth” at all levels. You probably don’t agree with this last bit, as for the M2 dev team this is “your project”, your “change”, and you can’t ignore the change – you need to manage it. To understand this better, try to see it from a dev project context where M2 is a dependency (like in every M2 website).

    1. Thanks for the detailed feedback!

      “Adding .gitignore to be “core” file inside M2 repo is wrong for several reasons:” – I want to make sure I understand this statement. A merchant creating a site will do a “composer create-project” which sets up a skeleton project. The merchant can make changes (e.g. add an extension to the composer.json file) and then commit that project to git. When you refer to “M2 repo” above do you mean the merchant’s project repository or the Magento GitHub repository? (They are quite different things.) We expect the merchant to fiddle the file as desired – we provide a starting point.

      We know there are issues with what we have today and are moving to fix them. Composer update re-overwrites some of the skeleton files in the merchant project area (which is undesirable). There are also too many files in the skeleton. Many of the files actually are core files and should never be touched. Others are starting-point files and should be written on create-project, but never touched again afterwards.

      I understand that the way git is built is it pushes change sets around, but at the end of the day what you execute on a web server is a particular snapshot of the entire code base. One approach is to commit the whole directory tree, the other is to commit the instructions to build the whole directory tree. The blog post recommends the latter, relying on Composer to reassemble the site identically each time (especially the composer.json file). (This is actually a risk when you have dependencies of the form “2.*”, as repeating the build instructions may result in different projects.) Some people will want to mandate committing *everything* to avoid this problem. That is their choice and I understand the reasoning. This post recommends using specific version numbers rather than version number ranges and relying on Composer as the source of truth for non-project packages – that is, relying on Composer to reliably return the same package each time for the same version number.

      Much of this is probably saying the same thing as you with different words. I think the problem you are mentioning with .gitignore I am agreeing with. We should create it on “composer create-project” then never touch it again. I suspect “composer update” is touching it again, and it should not. That is not a core file – it should be a starting skeleton project file only. Until we have all such cases fixed, there is a bit more manual effort during patches. But I am recommending people build projects today the way it should be once these issues are fixed.

      1. I meant the merchant project repo. If you are a git user you will probably fork the M2 repo and base all your projects on your fork for easy rebasing and bringing in updates. This is also one possible way of installation described in the dev docs. Right now .gitignore is part of the codebase, which means more work (as you have a conflict as soon as it is changed) for anyone going that route.

        Even if you go all composer, generating this once is fine, but updating an env-specific file later is not fine. Almost all other files (index.php, .htaccess etc.) you can serve untouched without hassle, and you do this by default anyway, but .gitignore is especially nasty since overriding it is more complex.

        The second part: ignoring “vendor” meant that when your “source of truth” repo.magento.com was down, all builds depending on it failed if you were not managing the “change” as a whole. For others who do (don’t ignore vendor and store all deps), builds were not failing – they just could not verify whether the “source of truth” had new changes for them (until the source was back up).

      2. Ah, I got the source of confusion now. You don’t fork github for projects – you fork if you want to submit a CE bug fix. Otherwise you download with composer and get patches by composer. You get extensions by composer too.

        The outage today on the repo highlighted the internal need for more repo management tools. We should never take the repo down – ever. We will be doing a postmortem to ensure we work out how to avoid the problem again.

      3. If you separate the two offered installation methods – one for contributing devs and the other for regular folks – then what is the difference 🙂 Is the code tagged for a release different in the public GitHub repo and repo.magento.com, or will it be?

        Not all extensions will use the Marketplace (think of all the in-house made or site specific extensions), and you definitely won’t be .gitignoring the extension code you get with composer either (just another dependency). I also believe that there will be more merchant controlled composer channels we can handle or access.

        Even if the M2 internal need doubles, triples or loses its importance around repo.magento.com, this is not the only obstacle around the “source of truth” for M2 projects, as the outage/obstacle might just be at some other level out of your control (issues in the client network, level of secrecy or security, etc.). The need for a client to manage their own “source of truth”, or in other words a version of some “change”, never goes away.

        I agree that each dev/merchant will choose their own route and most of the installation will depend (even in live sites) on repo.magento.com.

        If at some time in the future .gitignore will not be “core” file it would certainly make my day 🙂 as then dev can decide (more easily) what is important in their project .

      4. Advanced users may be able to work out how to manage and upgrade a project reliably via GitHub clones. I am not sure I could do it personally however! The initial install is easy. The hard part is doing a “composer update” where you get a collection of new extensions purchased from the Marketplace, core patches, etc. Composer worries about dependency and compatibility management. It makes sure you get a mix of packages that are compatible – or stops you if they are not. Using git clone rather than composer you lose some of those benefits.

        If you write a module local to your project, then put it in app/code. That project owns it. It will not be git ignored. If your company decides to create its own internal repository of packages, then you would install them via composer install and they would go under vendor (and hence gitignored). As the module is shared between projects, the code does not belong to that project. Instead you would put it in a separate git repo (with separate version management independent of the projects) and manage it just like any other composer package, using composer version management again.

        For issues around accessibility to the M2 repo, yes – we need to keep it up! But serious organizations can (1) use local composer caches and (2) Toran proxy can help. I have not tried these myself, but there are solutions to these problems. I have heard of one customer basically cloning their M2 repo access to an internal repo – saves Internet traffic. They periodically mirror the M2 repo (if up). I think that works too.

        “If at some time in the future .gitignore will not be “core” file it would certainly make my day 🙂 as then dev can decide (more easily) what is important in their project.” — Yes, absolutely. But you can do this today. You don’t have to wait. It is just a little awkward until we get it fixed completely. But it’s not that bad. That was the message I was trying to get across in the post.

      5. There’s nothing advanced in this. Install once, get all the dependencies, record the version, and further deploy down the stream (installations, tests, dev envs) to all the others without depending on the “source of truth” at each move. Once it is set up you can stream deltas of “change”, as your stored version of “change” is your “source of truth” for your project. There are huge savings in time and bandwidth when you handle the “change” instead of managing the code and depending on the “source of truth” as described in your post.

        Even if you think that the code belongs to some other repo (and this is also true in terms of the separate dependency management side), when dealing with the actual M2 website project (or any project), all the things you have downloaded (via git, composer, whatever else) and put together as a project belong to this project from this point on as dependencies (it can’t work if something is missing). Sure, you can update those dependencies via composer or other channels because “code belongs to somewhere else”, but it is still merely a dependency – you never change the core, and it can be handled in one place with some other pipeline. Therefore, after you have put together your own project you only have to visit the “source of truth” when there are updates available, and you don’t need to introduce the “source of truth” into every step of your deployment (create a proxy of it and so on, as you own the version of the dependency anyway) and depend on its outages or on it ever being available further down your development pipeline.

        Anyway, everyone is free to choose their weapons here. The M2 way described here is a bit overcomplicated and can be eased by choosing a different strategy: handling dependencies as “change” rather than as “code downloaded via Composer (and hence not “owned” by that project) that does not belong with the rest of the project source code”.

        Thanks for the discussion though, it has been enlightening. The only reason I wrote this is that M2 forces its way by including .gitignore in its repo; overwriting it without conflicts is possible but definitely harder than it would be if it were simply not committed there. I can handle this either way and will only applaud if it gets removed from source control at some point in the future.

  3. Hello Alan,

    Thank you so much for all those detailed posts. This is so awesome.

    I am just trying to figure out a good deployment process and tried your triggered build instructions. After cloning the repository (https://github.com/tschifftner/magento2-sample-repository) and running composer update I do not have the Magento CLI option “deploy” at all. Magento is not installed and this might be the reason. But at this stage all I want is a full build archive that is moved to multiple parallel test steps. Any hint to achieve this?

    1. It depends what you are trying to achieve, but that repo is our “sample data” – for loading into an installation. If you just want to play with a store, have a look at https://alankent.me/gsd – there is a Docker based install with Luma and the sample data all preloaded. Very easy to get going. All the code and tools installed for you.
      I actually have some other blogs coming (hopefully this week) talking about development experiences with GoDaddy and similar – turning these individual topics into a complete working set of steps for development. Bitnami have a M2 VM if you want to run it locally.
      But this is an area we are trying to improve right now. We want to remove some of the friction to get things going for a range of developers. So keep an eye out on my blog over the next few days, then please let me know which resonate. We do want to pick a “recommended approach” and get that merged into devdocs.magento.com as a simplified install experience.

  4. Hello Alan,

    I think you got me wrong here. Installation is not the problem. I am trying to figure out the deployment process within a Jenkins instance where I don’t want the full shop to be installed, but to build a tar archive with all the pulled-in composer modules.

    This is working great, except that there is no option to set production mode (Magento CLI) to compile it and generate the frontend assets. So I would have to run that command on all test instances before running the actual tests.

    I think the missing piece is the database + configuration, but that’s something I would like to avoid at this step.

    1. Ah, sorry. I see. I think we might be experiencing a similar problem ourselves at the moment if I understand. The static asset generation code requires a database to be available (due to themes), but in Jenkins there is no database server.

      If I have it right, a possible solution we are considering is to drop support for DB overrides for themes and treat them more like modules (that is, they are “files on disk” only). But we have not finalized this decision. (Opinions welcome by the way.)

      That leaves the option of having a DB available in the Jenkins environment for the build process to use (probably the best short term option), or run that phase before Jenkins (e.g. git commit the generated files). We are also considering having a compile mode flag to turn off the code checking the database for themes.

  5. mmenozzi:

    Hi Alan, thank you for this post. I think it’s a very important topic.

    I totally agree with your point of view about source code management and deployment process. We should all keep in mind that there’s a popular T-shirt with the claim “Did you just edit the core?” to emphasize that this is a bad practice. This happens because with M1 the most common approach is to put core files under version control. So to anyone who wants to continue with this bad practice I say: please, stop it! Don’t put under version control code that you don’t own, don’t put under version control core files and third party modules/themes.

    I also want to say that in a previous comment you said: “This is actually a risk when you have dependencies of the form 2.* as repeating build instructions will result in different projects”. This is not true. If you put the composer.lock file under version control (and you should do so), Composer will download the same set of packages in every environment every time the build process runs with that composer.lock file, even with dependencies of the form 2.*.

    So please keep pushing with this approach, it’s very appreciated!

    1. Thx for comment. Yes, you are right – committing composer.lock does solve the 2.* problem. Thx for feedback!

  6. Thanks for the article, it was very helpful.

    I am looking at a solution where I deploy using Ansible and a package called Ansistrano (combination of Ansible and Capistrano).

    It is working fairly well, but the one part I am unsure about is how to enable modules through automated deployment.

    I can add a command to my Ansible script that manually enables the modules I need and change it each time. Just wondering if there is any other way, such as a file I can maintain that has all my enabled modules (without manually editing config.php).

    So I guess, a manually editable file indicating enabled Magento modules similarly to how composer.json acts for php packages.

    Thanks.

    1. I don’t have a computer handy but I think there is a module:enable or similar command. Otherwise I normally commit config.php, which lists enabled modules. (Or have I missed the point?)
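
      Roughly, something like the following (‘Vendor_Module’ is just a placeholder name):

         magento module:status                 # lists enabled and disabled modules
         magento module:enable Vendor_Module   # placeholder module name
         git add app/etc/config.php            # the enabled-module list lives here
         git commit -m "Enable Vendor_Module"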

  7. Committing config.php makes sense. I didn’t see it addressed in the recommended .gitignore file so it was confusing. Thanks!
