Getting Started with Git 101

Scope

I have used many SCMs in the past: VSS, SVN, TFS, ClearCase, Mercurial, etc. There are so many of them, yet they are so similar that they are hardly worth a spot on a resume.

However, git was a little more challenging for me because its structure and architecture are different. I have now used git for just over a year, and I put the following together to cover what I believe is a good starting point for learning git commands.

Technical

Config

I use posh-git and found the default dark red color a bit hard to read against a dark background in the Windows console. I changed the colors to yellow and magenta by updating ~/.gitconfig.

[color]
    ui = true 
[color "status"]
    changed = magenta bold
    untracked = yellow bold
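
The same colors can also be set from the command line instead of editing the config file directly:

# Set the status colors via git config
$ git config --global color.status.changed "magenta bold"
$ git config --global color.status.untracked "yellow bold"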

Settings

# Change a global setting
$ git config --global --edit

# Change editor to notepad globally
$ git config --global core.editor notepad

# Set up the git diff / merge tool globally;
# for example, if we are using p4merge as the diff / merge tool
$ git config --global diff.tool p4merge
$ git config --global merge.tool p4merge

# Stop git mergetool from generating unwanted .orig backup files
$ git config --global mergetool.keepBackup false
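
# If p4merge is not on the PATH, point git at the executable explicitly
# (the path below is only an example for a typical Windows install)
$ git config --global difftool.p4merge.path "C:/Program Files/Perforce/p4merge.exe"
$ git config --global mergetool.p4merge.path "C:/Program Files/Perforce/p4merge.exe"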

Basic Changes

# Check pending check-in status
$ git status

# Get latest files
$ git pull

# Change branch
$ git checkout <branchName>

# Add files for pending check-in
$ git add <filename>

# Undo a git add
$ git reset <filename>

# Delete files for pending check-in
$ git rm <filename>

# Undo a pending file deletion (unstage, then restore the file)
$ git reset HEAD <filename>
$ git checkout <filename>

# Amend last commit
$ git commit --amend

# Undo commit
# This will reset the branch to the previous commit, keeping the changes in the working directory
$ git reset HEAD~

# Hard reset is a potentially dangerous command
# Changes are discarded and may not be recoverable
$ git reset --hard <commit-id>

# A force push makes the remote branch point to the same commit as the local branch
$ git push origin HEAD --force
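
# A safer alternative is --force-with-lease, which refuses to overwrite
# the remote branch if it has commits we have not seen locally
$ git push origin HEAD --force-with-lease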

# Discard changes in working directory
$ git checkout <filename>

# Discard untracked files in the working directory
# Dry run first to double check what would be deleted
$ git clean -f -n

# Then do the actual delete
$ git clean -f

# Also discard untracked directories in the working directory
$ git clean -df

Stash

It is similar to shelving in TFS.

# All unstaged and staged dirty files will be "stashed", 
# and the working directory will be cleaned.
$ git stash

# Show the list of stashes
$ git stash list

# Show the content of the latest stash
$ git stash show -p

# Retrieve, then remove, changes from the stash
$ git stash pop

# Apply changes from the stash (without removing them)
$ git stash apply

# Remove the latest entry from the stash
$ git stash drop
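
# The commands above act on the latest stash; a specific entry from
# "git stash list" can also be targeted by name
$ git stash apply stash@{1}
$ git stash drop stash@{1}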

Branch

# Delete a local branch
$ git branch -d <branchName>

# Delete a remote branch
$ git push origin :<branchName>
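
# Newer git versions also support a more readable delete syntax
$ git push origin --delete <branchName>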

# Rename current branch
$ git branch -m <newname>

Merge

# Merge a source branch into a destination branch
$ git checkout destination-branch
$ git merge source-branch

# Resolve a merge conflict
$ git mergetool

# Resolve merge conflicts by preferring theirs or ours while in a conflicted state.
# Take their changes
$ git checkout --theirs *
$ git add *

# Take our changes
$ git checkout --ours *
$ git add *

Tag

# List tags
$ git tag

# Add an annotated tag
$ git tag -a <tagName> -m "A message for tagging"

# Push local tags to the remote
$ git push origin --tags

# Branch out from a tag
$ git checkout -b <newBranchName> <fromTag>
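
# Delete a tag locally, then remove it from the remote
$ git tag -d <tagName>
$ git push origin :refs/tags/<tagName>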

Rebase

# Rebase the current branch onto its parent branch
$ git rebase <parentBranch>

# Resolve a conflict during rebase by aborting, skipping or continuing
$ git rebase --[abort|skip|continue]
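
# Interactively rebase the last few commits, e.g. to squash or reorder them
$ git rebase -i HEAD~3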

Fork and Submodule

# Add a remote repo to the current repo as a subfolder
$ git submodule add <gitRepo>
  
# Get latest in submodule
$ git submodule update
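
# After a fresh clone, initialize and fetch the submodules (recursively if nested)
$ git submodule update --init --recursive

# Or clone the repo and its submodules in one go
$ git clone --recursive <gitRepo>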

Demandware - Bazaarvoice Cartridge Decoupling by Using Tealium and Commerce Connect

Scope

We are implementing a Bazaarvoice integration with our Demandware ecommerce platform. For most Demandware customers, that requires installing and configuring the Demandware Bazaarvoice cartridge from the Demandware Marketplace.

For us, we are slightly ahead in our game plan and can do something a bit more advanced. We use Tealium for our tag management and Commerce Connect for our feed integration. Therefore, we can put our Bazaarvoice beacons in Tealium and generate our product feed from Commerce Connect. Our Demandware implementation then becomes much simpler, with just the review containers and the submission ISML template.

Technical

In a nutshell, we are implementing a number of Bazaarvoice products, and we distribute some of the responsibilities to other platforms depending on what is required.


The general idea of the above is to leave the HTML changes in Demandware, put all the JavaScript in Tealium, and create feed jobs that run from Commerce Connect.

SEO is implemented within the Demandware cartridge.

The Product Catalog Feed is moved to Commerce Connect, the same platform that we use to manage our eBay and Google feeds. We can set up a new channel with the Bazaarvoice type.


Question and Answer / Ratings and Reviews are split between Demandware and Tealium. The bvapi.js tag goes to Tealium, but we need to implement the HTML containers and initialize the inline ratings in Demandware.

The ROI Beacon is basically a JavaScript call to $BV.SI.trackTransactionPageView(). This is achieved via Tealium.
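
As a rough sketch, the Tealium tag would look something like the snippet below. The orderData variable and its field names are illustrative only; the exact transaction fields should follow the Bazaarvoice ROI beacon documentation and our Tealium data layer mappings.

// Fire the Bazaarvoice ROI beacon on the order confirmation page
// (orderData is a hypothetical object mapped from our data layer)
$BV.SI.trackTransactionPageView({
    orderId  : orderData.orderId,
    total    : orderData.total,
    currency : orderData.currency,
    items    : orderData.items // array of item objects as specified in the Bazaarvoice docs
});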

The submission form is a standalone Bazaarvoice component for customers to submit reviews, so I left everything in Demandware, including the JavaScript. It is implemented in the Demandware cartridge as the Bazaarvoice-Container pipeline.

We also made some UI changes to include our company header in the container. The container URL is configured via Config Hub.


Conclusion

There were a few hurdles during this process, but as our ecommerce system grows and integrates with more vendors, it is essential to set up this foundation correctly.

By decoupling some of the job responsibilities to other vendors, our ecommerce system can focus on strategy and planning, while leveraging our vendors to help us manage our tags, product feeds and product reviews.

Demandware - How to replicate from PIG to SIG

Scope

A typical scenario: we want to bring data down from the staging instance in the Primary Instance Group (PIG) to one of our sandboxes, so that we have up-to-date content in our Secondary Instance Group (SIG). By default, Demandware does not provide this functionality out of the box.

Solution

One easy way to achieve this is the following.

  1. Go to Business Manager in PIG
  2. Administration >  Site Development >  Site Import & Export
  3. Export site and Save in Global Export Directory
  4. Optionally run dbinit in SIG via Control Center
  5. Go to Business Manager in SIG
  6. Administration >  Site Development >  Site Import & Export
  7. In the Import panel, the site backup will be available in the global location.


Bitbucket Continuous Integration with Bitbucket Pipelines

Scope

We use Bitbucket as our SCM, along with a few other Atlassian products, in our development team. I am happy to say that I am pleased with the tools; they make our daily work very productive and hassle free.

Recently, I have been looking into a CI/CD solution for our deployment process. I have previously written about Setting Up Jenkins for GitHub, Setting Up Octopus Deploy for Jenkins with nopCommerce Projects, and Setup Continuous Integration with Visual Studio Online. I like the design of Octopus Deploy, but given that we are now happily living within our Atlassian suite, I want to see what Bitbucket Pipelines (the successor to Bamboo Cloud) can offer me.

I just signed up for the Bitbucket Pipelines beta and received the invitation within about 6 hours :)

Goal

  • Set up and integrate Bitbucket Pipelines with our Bitbucket repository.
  • Explore the possibility of automatically running the Demandware Grunt Build Suite when we commit to master.

Setup

I clicked the link to install the Bitbucket Pipelines add-on, went through a simple setup procedure, and enabled it in my repository.


It adds a Pipelines link in my repository.

I went through the wiki to configure the bitbucket-pipelines.yml file.

It is essential to understand the Docker image used by Bitbucket Pipelines. The default image is Ubuntu-based and is good enough for our scenario.

In the Settings tab, we are able to set up environment variables for the username and password.


We created a bitbucket-pipelines.yml script in the root folder.
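
A minimal sketch of such a file, assuming a Node-based Grunt build (the Docker image tag and Grunt task names are placeholders for our actual build):

image: node:4

pipelines:
  branches:
    master:
      - step:
          script:
            - npm install
            - npm install -g grunt-cli
            - grunt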



Check-ins to master will trigger Bitbucket Pipelines to run the build job and script.



Thoughts

I have not had much chance to explore further, but it looks quite promising and does what I want. I like the fact that it is tightly integrated with Bitbucket, so I do not need to log in to another service, and the setup was simple and straightforward.

"Build servers build" - Bitbucket is a CI server and not a CICD server. We will leverage Bitbucket Pipelines for our continuous integration and our Demandware Business Manager Code Replication for deployment purpose. I am pleased with the Bamboo Pipelines, this should work well and fits our CICD strategy.



Demandware - Implementing 3rd party FTP Service for XML Drop

Scope

I developed a few Demandware cartridges that export XML via FTP for 3rd party integration. A few of the things I have done:


  • Piggybacking an FTP drop task onto an existing job schedule that imports XML from a different integration point
  • Using a standard Demandware pipelet to generate an XML feed for the FTP drop
  • Creating a custom script node to generate an XML feed for the FTP drop

The job itself wasn't too difficult, but I found a few hidden gotchas that I believe are noteworthy.


We are dealing with the latest Demandware version 16.7.

Technical

FTP Constraints

  • The FTP connection is established using passive transfer mode (PASV) only.
  • Demandware only supports FTP and SFTP for backend integration, not FTPS.
  • Use dw.net.FTPClient for FTP and dw.net.SFTPClient for SFTP (see the sketch after this list).
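
A rough sketch of an SFTP drop in Demandware script is shown below. The host, credentials and paths are placeholders, and the method calls should be double-checked against the current dw.net API documentation.

importPackage( dw.io );
importPackage( dw.net );

// Upload a generated feed file from the IMPEX folder to the vendor's drop folder
var file : File = new File(File.IMPEX + "/src/feeds/export.xml");
var sftp : SFTPClient = new SFTPClient();

if (sftp.connect("ftp.example.com", "username", "password"))
{
    sftp.cd("/incoming");               // change to the vendor's drop folder
    sftp.putBinary("export.xml", file); // upload the generated feed
    sftp.disconnect();
}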

ArrayList

In our code, we use dw.util.ArrayList to filter out products for export. We ran into an error because an ArrayList cannot hold more than 20,000 elements.
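
One way to work around the limit (a sketch only; products is an assumed iterator and the chunk size is arbitrary) is to split the results into several smaller ArrayLists and export them in batches:

importPackage( dw.util );

// Split a large iterator into smaller ArrayLists so no single list
// approaches the 20,000 element quota
var chunks : ArrayList = new ArrayList();
var current : ArrayList = new ArrayList();

while (products.hasNext())
{
    current.add(products.next());

    if (current.size() >= 10000)
    {
        chunks.add(current);
        current = new ArrayList();
    }
}

if (current.size() > 0)
{
    chunks.add(current);
}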

Front End Debugging

I was testing a pipeline start node via a direct URL from a browser. It did not turn out well, because some pipelets can only be executed from the backend. The error message looks like this:

com.demandware.core.quota.QuotaExceededException:
Limit for quota 'api.pipelet.ImportExport' exceeded. Limit is 0, actual is 1.

Job Schedule

We tried to schedule a job for our cartridge and learned the hard way that JS controllers are not supported for the Business Manager job schedule. Only pipelines can be used.

Traversing XML

If you happen to need to traverse the XML, Demandware uses ECMAScript for XML (E4X). Here is a quick start guide, and a small example is sketched below.
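
// A minimal E4X sketch; the element and attribute names are illustrative
var ordersXml : XML = new XML(xmlString);

// Iterate over child <order> elements
for each (var order in ordersXml.order)
{
    var id : String = order.@id.toString();      // attribute access
    var total : String = order.total.toString(); // child element access
}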

Firewall

Besides the firewall rules that I needed to manage with our external vendor, we also need to request an outbound firewall rule from Demandware.

As they pointed out in their doc, "Before you can make an outbound FTP connection, the FTP server IP address must be enabled for outbound traffic at the Demandware firewall for your POD. Please file a support request to request a new firewall rule".

Demandware provides steps to request a firewall rule to allow outgoing connections to 3rd party services.

Demandware - Monogram by using Product Options

Scenario

Our business wants to start selling personalized products with monogramming. In this particular promotion, customers get a free monogrammed luggage tag when they buy a bag.

Technical

In Demandware, this can be achieved by using Product Options, which is the recommended way to handle monogramming. As stated in the documentation, "product options enable you to sell configurable products that have optional accessories, upgrades...".

We can set up product options by going into Business Manager. Merchant Tools -> Products and Catalogs -> Product Options.

Create a New Option and put in relevant details.


In our scenario, we allow up to 3 initials for our monogram. Due to Demandware limitations and our UI design, we decided to store the values in 3 different product options. Each option stores A-Z and a blank; this can vary based on your technical and UI requirements.


We will then create the System Object Attribute to store the Product Options.

In Administration -> Site Development -> System Object Types, we will add new attributes for Product.



Result

Monogramming options can be set at a per-color level. Therefore, on our product detail page, we are now able to sell our chocolate color bag with monogramming options but not our black color bag.


And the monogram information is available in our mini cart, cart, checkout and all the way through to our order summary page, as well as in the order confirmation email that we send to our customers.

Conclusion

There were a few discussions, considerations and research done within our team when we were planning the monogram feature for our products. Once we read enough Demandware documentation, and with a bit of trial and error in our sandbox, we arrived at our solution. We are happy with this implementation because it uses native Demandware features rather than putting custom code in our cartridge.

Demandware - Get Active and Upcoming Promotions

Scope

We are trying to get a collection of promotions that are either active or upcoming. We want to pre-calculate all our promotional prices in advance for our affiliate feeds, so they get the most up-to-date prices without requiring frequent feeds from us.

At the time of writing, we are using the latest Demandware 16.7 API.

Technical

In the Demandware API, there is no single call that returns both active and upcoming promotions. Ideally, we would like to return a collection combining the results of the following two calls.

PromotionMgr.getActivePromotions();
PromotionMgr.getUpcomingPromotions(1000);

Since these two methods each return a PromotionPlan object, and there is no supporting method such as promotionPlanA.Add(promotionPlanB), we need to write a bit more code to achieve what we want.

importPackage( dw.campaign );
importPackage( dw.util );

function getActiveAndUpcomingPromotions() : Collection
{
 var promos = PromotionMgr.getPromotions().iterator();
 
 var result : ArrayList = new ArrayList();
 var now : Date = new Date();
 
 while (promos.hasNext())
 {
  var promo = promos.next();
  
  if (promo.active 
  || (promo.startDate != null 
    && promo.startDate > now 
    && (promo.endDate == null || promo.endDate > now)))
  {
   result.push(promo);
  }
 }
 
 return result;
}

Notes

  • Depending on the number of promotions in the system, looping through all of them can be an expensive operation, so it is recommended to cache the resulting collection locally.
  • Our custom code improves on PromotionMgr.getUpcomingPromotions(previewTime : Number), which cannot retrieve ALL future promotions because the previewTime is compulsory.
  • Our custom code only returns a Collection of Promotion objects; this is not the same as a PromotionPlan.
  • Beware of the 20,000 array size limit in Demandware. Pushing more than 20,000 promotions into the ArrayList will throw an exception.