Demandware - Ajax Fetch Form Post in React

Scope

Previously, we did a mini project for Demandware - converting a multi-step checkout into a one-page checkout.

The Ajax post was done with jQuery, as that is the standard JavaScript framework in Demandware. As we are migrating from jQuery to React, we no longer have the luxury of using $.ajax().


Technical

In our ES6/JavaScript, we will use fetch to perform the form post. In React/Redux, we will do this in our action.

Without the jQuery framework, we are also no longer able to use .serialize() to serialize the form object. This needs to be done in vanilla JS, and I found a library that does exactly that. :)

Code


import serialize from 'form-serialize'

export const postFormAjax = (actionUrl) => {

  // Serialize the form into an application/x-www-form-urlencoded string
  const myForm = document.getElementById('my-form')
  const data = serialize(myForm)

  return dispatch => {

    const init = {
      credentials: 'same-origin',
      method: 'POST',
      headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
      body: data
    }

    return fetch(actionUrl, init)
      .then(response => console.log(response))
  }
}

Note

  • In the above code, I passed the actionUrl into my JavaScript from my Demandware ISML via URLUtils.continueURL().toString().
  • We need to set the Content-Type header manually, otherwise pdict.CurrentHttpParameterMap will be empty.
  • I tried to use FormData, but I couldn't get it to work with the latest Chrome and Demandware. I ended up with a bunch of WebKitFormBoundary strings that were not usable.
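As a side note, for anyone who would rather avoid the extra library: a urlencoded body can also be built by hand with URLSearchParams. A minimal sketch (the field names below are made up for illustration, not our actual form fields):

```javascript
// Hypothetical alternative to form-serialize: build the
// application/x-www-form-urlencoded body with URLSearchParams.
function toUrlEncoded(fields) {
  const params = new URLSearchParams()
  Object.keys(fields).forEach(key => params.append(key, fields[key]))
  return params.toString()
}

// The result can be used directly as the fetch() body,
// with the same Content-Type header as in the code above.
const body = toUrlEncoded({ firstName: 'Jane', postCode: '2021' })
// body === 'firstName=Jane&postCode=2021'
```

Note that URLSearchParams encodes spaces as '+', matching what a urlencoded form post expects.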



Result

I implemented the above in React, and Demandware now parses the form data correctly.



On our checkout page, I can dispatch actions to disable/enable the correct panels while performing the Ajax post in the background.
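The panel state itself can live in the Redux store. A minimal sketch of the reducer side, assuming hypothetical PANEL_DISABLE / PANEL_ENABLE action types (not our actual ones):

```javascript
// Hypothetical reducer sketch: tracks which checkout panels are
// disabled while an Ajax post is in flight.
function panelsReducer(state = {}, action) {
  switch (action.type) {
    case 'PANEL_DISABLE':
      return Object.assign({}, state, { [action.panel]: { disabled: true } })
    case 'PANEL_ENABLE':
      return Object.assign({}, state, { [action.panel]: { disabled: false } })
    default:
      return state
  }
}

// In the thunk, dispatch PANEL_DISABLE before fetch() and
// PANEL_ENABLE once the promise settles.
```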


Fetch then Promise by Google Geocoder Example

Scope

Our store locator supports searching by post code or suburb names.


However, our Demandware instance only has store information by post code. To retrieve store information by suburb, we first make an Ajax call to the Google Maps Geocoding API to find the post code for the suburb, then use that to search for stores.

All of the above is JavaScript driven by React dispatches. In our console log, we found that the code was fetching getStoreUrl before the Google API called back to our site, which is incorrect. The timeline looks like this.


Technical

The goal: after we call geoCoder(), we need to wait for the Google API callback() before calling getStoreUrl().

To achieve this, we wrap the Google API call in a Promise object. For simplicity, I stripped out some of our custom logic; our code looks like this.

export const geoCoderPromise = (address) => {
  console.log('geoCoderPromise() called.')

  const geocoder = new google.maps.Geocoder()

  return new Promise(function(resolve, reject) {

    geocoder.geocode({ 'address': address }, function(results, status) {
      console.log('geoCoderPromise Callback() called', results)

      if (status === google.maps.GeocoderStatus.OK && results[0]) {
        resolve(results[0])
      } else {
        reject(status)
      }
    })
  })
}

We use 'then' to chain up the Promise objects chronologically.

export const fetchStores = (address) => {
  console.log('dispatch action - fetchStores')

  return dispatch => {
    return geoCoderPromise(address)
      .then(result => fetch(getStoreUrl(result)))
      .then(response => response.text())
      .then(text => console.log(text))
  }
}

Looking at the console log, the execution sequence is now correct. First we ask Google for the post code of Paddington, then we pass 2021 as a parameter to getStoreUrl to generate the URL, and then dispatch a fetch().
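This pattern generalizes to any callback-style API. A self-contained sketch of the same Promise wrapping (fakeGeocode is a stand-in for geocoder.geocode, invented here so the example can run without the Google Maps SDK):

```javascript
// Stand-in for geocoder.geocode: calls back with (results, status).
function fakeGeocode(query, callback) {
  callback([{ postCode: '2021' }], 'OK')
}

// Wrap the callback API in a Promise, resolving on 'OK' with a result
// and rejecting otherwise - the same shape as geoCoderPromise above.
function promisifyGeocode(query) {
  return new Promise((resolve, reject) => {
    fakeGeocode(query, (results, status) => {
      if (status === 'OK' && results[0]) {
        resolve(results[0])
      } else {
        reject(new Error(status))
      }
    })
  })
}

// 'then' now guarantees ordering: the store lookup only runs
// after the geocode callback has fired.
promisifyGeocode('Paddington')
  .then(result => console.log(result.postCode))
  .catch(err => console.log('Geocoding failed:', err.message))
```

A .catch at the end of the chain is worth adding in the real fetchStores too, since geoCoderPromise can reject.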



Coding JSX in Eclipse

Setup Eclipse for JSX


We use Eclipse in our dev environment because that is what Demandware UX Studio officially supports. As we migrate our jQuery to React.js, coding JSX in Eclipse requires some setup.


Associate *.jsx with the JavaScript Editor as the 'default' editor.


Associate the content type to the file type.


Bonus: Webpack

We use webpack to bundle our files, with Eclipse running in the background to automatically upload our cartridge to our Demandware sandbox. For Eclipse to detect a change and trigger the upload automatically, there is one more setting we need to change - Refresh using native hooks or polling.
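For reference, a minimal sketch of what the webpack side of this might look like for JSX. Everything here is illustrative (paths, entry name, and the babel-loader/React preset are assumptions, not our actual config):

```javascript
// webpack.config.js - illustrative sketch only
module.exports = {
  entry: './app/main.jsx',
  output: {
    // Emit the bundle into the cartridge so Eclipse picks up the
    // change and uploads it to the sandbox.
    path: __dirname + '/cartridge/static/default/js',
    filename: 'bundle.js'
  },
  module: {
    rules: [
      {
        test: /\.jsx?$/,          // handle both .js and .jsx
        exclude: /node_modules/,
        use: { loader: 'babel-loader', options: { presets: ['react'] } }
      }
    ]
  },
  resolve: { extensions: ['.js', '.jsx'] }
}
```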

Coding in JSX - How to Comment and if-else statement

Coding in JSX

Dealing with JSX is almost vanilla-HTML-like, but it can still be a little tricky, so I documented two issues I encountered while working with React and JSX.

How to Comment in JSX

I was trying to comment out some of the HTML inside my JSX, and it looked weird and didn't render properly.

render() {
  return (
    <div>
      <div>Hello World</div>
      <!-- This doesn't work! 
        <div>Hidden note</div>
        -->
    </div>
  )
}

What I really needed is this - a 'Java' style rather than an 'HTML' style comment.

render() {
  return (
    <div>
      <div>Hello World</div>
      {/* My Comments...
        <div>Hidden note</div>
        */}
    </div>
  )
}

if-else in JSX

render() {
  return (
    <div>
      <div>Hello World</div>
      { if true }
        <div>true</div>
      { else }
        <div>false</div>
    </div>
  )
}

A normal if-else statement like the above cannot be compiled. However, JSX accepts the ternary (shorthand) notation, since it is an expression.

render() {
  return (
    <div>
      <div>Hello World</div>
      { true ? <div>true</div> : <div>false</div> }
    </div>
  )
}

Getting Started with Git 101

Scope

I have used many SCMs in the past: VSS, SVN, TFS, ClearCase, Mercurial, etc. There are so many of them, yet they are so similar that they are not even worth a spot on a resume.

However, git was a little more challenging to me, as its structure and architecture are different. I have now used git for just over a year, and put the following together, covering what I believe is a good starting point for learning git commands.

Technical

Config

I use posh-git and found the default dark red color a bit hard to read against a dark background in the Windows console. I changed the colors to yellow and magenta by updating ~/.gitconfig.

[color]
    ui = true 
[color "status"]
    changed = magenta bold
    untracked = yellow bold

Settings

# Change a global setting
$ git config --global --edit

# Change editor to notepad globally
$ git config --global core.editor notepad

# Setup git diff / merge tool globally
# for example, if we are using p4merge as a diff / merge tool.
$ git config --global diff.tool p4merge
$ git config --global merge.tool p4merge

# Git merge generates unwanted .orig file
$ git config --global mergetool.keepBackup false

Basic Changes

# Check pending check-in status
$ git status

# Get latest files
$ git pull

# Change branch
$ git checkout <branchName>

# Add files for pending check in
$ git add <filename>

# Undo a git add
$ git reset <filename>

# Delete files for pending check in
$ git rm <filename>

# Undo pending delete files 
$ git reset HEAD <filename>

# Amend last commit
$ git commit --amend

# Undo commit
# This will reset the branch to a previous commit 
$ git reset HEAD~

# Hard reset is a potentially dangerous command
# Changes are destructive and may not be recovered
$ git reset --hard <commit-id>

# a force push will force the origin to point to the same commit as local
$ git push origin HEAD --force

# Discard changes in working directory
$ git checkout <filename>

# Discard untracked files in working directory
# Double check what to delete
$ git clean -f -n

# The actual deleting
$ git clean -f 

# Discard untracked folders in working directory
$ git clean -df 

Stash


Stash is similar to shelve in TFS.

# All unstaged and staged dirty files will be "stashed", 
# and the working directory will be cleaned.
$ git stash

# shows the list of stash
$ git stash list

# shows content of stash
$ git stash show -p

# retrieve then remove changes from the stash 
$ git stash pop

# apply changes (and not removing) from the stash
$ git stash apply

# remove changes from the stash
$ git stash drop

# remove all stash history
$ git stash clear

Branch

# Delete a local branch
$ git branch -d <branchName>

# Delete a remote branch
$ git push origin :<branchName>

# Rename current branch
$ git branch -m <newname>

Merge

# Merge a branch from source to destination
$ git checkout destination-branch
$ git merge source-branch

# Resolve a merge conflict
$ git mergetool

# Resolve merge conflict with theirs or ours preference during a conflicted state.
# Take their changes
$ git checkout --theirs *
$ git add *

# Take our changes
$ git checkout --ours *
$ git add *

Tag



# listing tags
$ git tag
  
# add tag
$ git tag -a <tagName> -m "A message for tagging"
  
# push local tags to remote
$ git push origin --tags

# branch out from a tag
$ git checkout -b <newBranchName> <fromTag>

Rebase

# Rebase branch from parent branch
$ git rebase <parentBranch>

# Conflict during rebase
$ git rebase --[abort|skip|continue]

Fork and Submodule

# Add a remote repo to current repo as a subfolder
$ git submodule add <gitRepo>
  
# Get latest in submodule
$ git submodule update

Demandware - Bazaarvoice Cartridge Decoupling by Using Tealium and Commerce Connect

Scope

We are implementing a Bazaarvoice integration with our Demandware ecommerce platform. For general Demandware customers, that requires installing and configuring the Demandware Bazaarvoice cartridge from the Demandware Marketplace.

For us, we are slightly ahead in our game plan and can do something a bit more advanced. We use Tealium for our tag management and Commerce Connect for our feed integration. Therefore, we can put our Bazaarvoice beacons in Tealium and generate our product feed from Commerce Connect. Our Demandware implementation then becomes much simpler, with just the review containers and the submission ISML template.

Technical

In a nutshell, we are implementing a bunch of Bazaarvoice products, and we distribute some of the responsibilities to other platforms depending on what is required.


The general idea of the above is to leave the HTML changes in Demandware, put all the JavaScript in Tealium, and create feed jobs that run from Commerce Connect.

SEO is implemented within the Demandware cartridge.

Product Catalog Feed is moved to Commerce Connect - the same platform we use to manage our eBay and Google feeds. We can set up a new channel with the Bazaarvoice type.


Question and Answer / Ratings and Review are split between Demandware and Tealium. The bvapi.js tag goes to Tealium, but we need to implement the HTML containers and initialize the inline ratings in Demandware.

ROI Beacon is basically a JavaScript call to $BV.SI.trackTransactionPageView(). This is achieved via Tealium.

The submission form is a standalone Bazaarvoice component for customers to submit reviews, so I left everything in Demandware, including the JavaScript. It is implemented in the Demandware cartridge as the Bazaarvoice-Container pipeline.

We also made some UI changes to include our company header in the container. The container URL is configured via Config Hub.


Conclusion

There were a few hurdles along the way, but as our ecommerce system grows and integrates with more vendors, it is essential to set up these foundations correctly.

By decoupling some of the job responsibilities to other vendors, our ecommerce system can focus on strategy and planning, while leveraging our vendors to help us manage our tags, product feeds and product reviews.

Demandware - How to replicate from PIG to SIG

Scope

A typical scenario: we want to bring a site export down from our staging instance in the Primary Instance Group (PIG) to one of our sandboxes, so we have up-to-date content in our Secondary Instance Group (SIG). By default, Demandware does not provide this functionality and it cannot be done out of the box.

Solution

One easy way to achieve this:

  1. Go to Business Manager in PIG
  2. Administration >  Site Development >  Site Import & Export
  3. Export site and Save in Global Export Directory
  4. Optionally run dbinit in SIG via Control Center
  5. Go to Business Manager in SIG
  6. Administration >  Site Development >  Site Import & Export
  7. In the Import panel, the site backup will be available in the global location.