Thinking Beyond Beacons: Macy’s Shopkick Rollout

Macy’s recently announced the largest deployment of beacon technology in retail, with more than 4,000 Shopkick devices planned for installation across all of its stores in the U.S. While it is encouraging to see a retailer the size of Macy’s embracing proximity marketing, we think there is a lot more that Bluetooth can enable for brands.

Commercial platforms like Shopkick are limited in their ability to connect a customer experience to the brand. Instead, they generalize an experience across multiple brands by gamifying the in-store visit into a scavenger hunt for “kicks,” points that are awarded to users for performing various in-store tasks. In fact, many users have figured out how to exploit apps like Shopkick to get more points, since the application awards “kicks” for scanning the barcodes of certain products in-store (assuming that the user would have to walk through the store to find the item). Sites have popped up that let Shopkick users share barcodes so they can print them out, walk into a store, and immediately scan the UPC codes without ever finding the products on the shelves. Yes, it gets customers in the store, but is it really optimizing dwell time and engagement?

Additionally, commercial apps that are separate from a brand’s app divorce the customer’s intent from a deeper interaction with a brand like Macy’s. Instead, they replace potentially meaningful brand engagement with another incentive: gathering points to cash in for other items. And those prizes include competitors to Macy’s, such as gift cards to Target, TJ Maxx, JC Penney, and other retailers. Conceivably, I could walk around a Macy’s store gathering “kicks” and then trade them in for a gift card to a competitor without ever spending a dollar at Macy’s.

Furthermore, Shopkick diminishes a brand’s ability to tailor notifications, and it risks flooding the shopper with a deluge of notifications with little thought given to targeting or contextualization. (How can they provide meaningful information if they don’t know your brand or your customers?) One of the biggest challenges with beacon deployments like this is user retention: consumers will opt out if the notifications don’t provide value.

Our perspective is that technology should make sense and provide value for both the brand and the customer. Unfortunately, off-the-shelf deployments tend to fall short on delivering those experiences. Retailers should invest in technology that is cohesive with the brand and integrated with their loyalty programs. It should also give them more granular control over how, when, and why they engage their customers.

NYC BigApps Competition: Big Ups to Heat Seek

Recently, the folks at Heat Seek NYC presented their work at NY Tech Meetup, and the project, highly deserving of its many awards, is a great example of the future of tech and civic innovation in New York City.

Why? Because it works to improve city services in a tangible way, brings the Internet out of our “computer boxes”, and leverages the growing capabilities of small startups like Heat Seek to manufacture and build Internet-connected objects. And soon, New York City will become even more friendly to such projects.

Heat Seek focuses on an issue that is deeply in need of a data-driven approach: tenant heat complaints. Looking at the data, this is a common dispute in our city. While Heat Seek won’t completely eliminate these disputes, it can blanket an entire building with temperature sensors, making more honest building-wide measurements possible. In fact, the system has value for all three stakeholders: tenants get data-driven, pre-filled city forms for heat complaints; landlords get visibility into where to insulate (a task for which grant programs also exist), better-optimized energy usage, and happier tenants; and Heat Seek sells hardware.

Secondly, the project is more than a “USB dongle” one attaches to a computer. It’s an object that residents both young and old can see, understand, and deploy. With its “hub and node” networking model, the nodes can be given out to less tech-savvy residents and require zero setup or configuration, just batteries. The hub (which has a more complex setup) can be owned by a landlord or a nerdy neighbor.

As an entry for the NYC BigApps competition, the project stands out further given that it engages the local manufacturing scene, one of the City’s stated economic development targets. Companies like Heat Seek, Tomorrow Lab, and others working on “Internet of Things” devices, coupled with NYC-based online manufacturing platforms like Machine Made, can produce products at scale and distribute them to eager customers. Might New York become the place to launch that new Internet-connected product?

Looking to the future, the City’s recent Payphone of the Future initiative, which will deliver public Wi-Fi access to every borough, might create a world where these devices ship configured to connect to free City Wi-Fi by default, creating a zero-configuration, take-it-out-of-the-box-and-it-works experience for many customers in New York City.

Heat Seek is a clever answer to a thorny problem. The team has created a device that sits at the confluence of a currently expensive, labor-intensive process (311 apartment heat inspections) and changing consumer technology and manufacturing trends, and it is pursuing a strategy that delivers real value to both landlords and tenants (skirting the conflict that usually pits those two against each other). I think Heat Seek might become a new model for digital civic startups and citizen engagement.

This post was written in collaboration with Victoria Dower (who was super impressed by Heat Seek’s NYTM demo). 

What does Apple Pay mean for retailers?

Yesterday Apple announced a contactless payment technology in the iPhone 6 and iPhone 6 Plus called Apple Pay, which will allow consumers to buy products in-store with just a tap of their iPhone. Apple Pay leverages a combination of iPhone hardware, including NFC technology, Touch ID, and the Secure Element (a dedicated chip that stores encrypted payment information), to process payments and ensure the security and privacy of personal data. In fact, neither Apple nor the retailer will know what shoppers are buying, because a one-time payment number and a dynamic security code are used to complete the transaction.

This comes at a time when Chip-and-PIN systems are becoming the standard for payment security. Chip-and-PIN experienced a fragmented rollout and low user adoption in the U.S. due to high hardware costs and the slow transition of credit and debit cards to chips, but it has been widely adopted in Europe over the years. Now, though, both magnetic stripes and PINs are showing their vulnerabilities. Recently, a video went viral showing how easy it is to steal a PIN with cheap and easily accessible infrared cameras (that attach to iPhones, no less!).

In typical Apple fashion, Apple Pay puts consumer needs first, so anonymity of purchase data was touted as a major feature. With Apple’s refined understanding of user interaction, the secure fingerprint scan and NFC tap are positively frictionless compared to typing in a PIN, which is the security mechanism used by Google Wallet and other digital wallet platforms. And with Apple leading the market into NFC acceptance, NFC reader penetration at the point of sale should climb much higher. Apple Pay may also accelerate the killing off of retailer-specific apps by moving the “Uber model” of payment from individual apps into the Apple ecosystem. All of this may be the perfect salve for consumers left jittery by the Target and Home Depot data breaches.

But it also means that retailers will lose a tremendous amount of value and customer insight found in credit card purchase data. While the loss of transaction data may seem like a huge hit to retailers and their omni-channel efforts, Apple Pay does present some great opportunities for them:

  • Allows retailers to offload security concerns to the consumer’s device (rather than holding a treasure trove of credit card numbers and names).
  • Provides opportunities for MUCH more flexible mobile POS options and pop-up shops.
  • The “no card present” fees that Apple negotiated are far lower than what standard retailers have been able to get from financial institutions. This really moves the needle for retailers by reducing a major hard cost.
  • Provides motivation for the rest of the digital wallet marketplace to follow Apple’s lead. This should eventually lead to greater and faster consumer adoption.
  • Retailers can still use a combination of loyalty programs and proximity sensors to respond to their customers.

The Making of Future Makers

Control Group is a proud Founding Partner of a new high school, called the Urban Assembly Maker Academy, which opened last week in our Lower Manhattan neighborhood. UA Maker is a Career and Technical Education (CTE) school with a mission to empower students to be successful, adaptive citizens through challenge-based learning and the principles of design thinking.

UA Maker’s curriculum prepares students for both college and careers by teaching them how to use design thinking and technology to solve problems. The school features a new kind of classroom experience that models aspects of the modern agile workplace so that students can develop the skills, tools, and habits of inquiry to be tomorrow’s “makers.”

Control Group got involved with UA Maker Academy because we believe that the world’s challenges require problem solvers who are equipped with both critical and creative thinking skills. They will need to be curious about the world around them and to empathize with others in order to develop the best solutions for people, communities, and businesses. Beyond a textbook education, the next generation of strategists, engineers, and designers deserves exposure to and experience in tackling real-world problems.

In our business, we use principles of design thinking to create successful products and experiences for our clients. By leveraging a human-focused mindset, we have a clear path and method for collaborating with stakeholders to create the most impactful solutions. Working with the Urban Assembly and an energized group of industry and higher-ed representatives, an amazing group of ambitious and talented educators is giving students the opportunity to approach their world with empathy, confidence, and action as the backbone of their high school experience. This is just what we need to build the future.

 

Data Freedom: Part 2 of 3

Sure, it’s not exactly like Scottish independence, but I feel like William Wallace might still give us the nod for our own effort at (data) freedom.

A few weeks ago we started looking at data freedom because, while there are many advantages to using SaaS vendors, there are some issues to keep an eye on. One of those issues is finding ways to access and use the data that’s been sent out into the vendor’s system. The first installment of this series was about a small problem with a fast solution. We didn’t have to worry about real-time or frequently-changing data.

But for Vendor 2, things weren’t so easy. Like well-known #2s Art Garfunkel and Ed McMahon, Vendor 2 is easy to overlook but nonetheless necessary on a day-to-day basis. Vendor 2 is one of those internal tracking vendors we use every day with data that changes quickly and often.

Vendor 2 got the job done for us, but sadly, its reporting left something to be desired. Sure, there were reports, but there was no way to link to external data. And don’t get me started on getting it to do any complicated slicing-and-dicing. We ended up with a lot of people who needed to pull down spreadsheets and re-do the same calculations month after month. We heard the cries from people-who-will-remain-nameless (but who are me): “I can write the darn SQL if you just let me!”

So, how did we set up a system that uses SaaS vendor data but reports the way we want it to? We set up a system to copy their information to a database we control… and then we wrote the darn SQL.

Easier said than done, for sure. For this case, we called in bigger guns and took a look at Talend, a full Enterprise Service Bus (ESB) solution that let us build our own data store. The goal was a data store on our own terms, one that auto-updates in near real time as information changes on the vendor’s side, via Vendor 2’s full-featured API. Now we can do what we need with the data: write the SQL for static reports or hook up a BI tool to view it. Whatever we need.
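
For the curious, here is roughly the shape of that sync, written as a plain Apache Camel RouteBuilder rather than on the Talend canvas. It is only a sketch: the endpoint, the XML request, the table, and the data source name are invented stand-ins, not Vendor 2’s actual API or our real schema.

import org.apache.camel.Exchange;
import org.apache.camel.builder.RouteBuilder;

public class VendorMirrorRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("timer:vendorSync?period=900000")                                    // poll every 15 minutes
            .setHeader(Exchange.HTTP_METHOD, constant("POST"))
            .setBody(constant("<request><object>timesheets</object></request>"))  // hypothetical XML envelope
            .to("http://api.vendor2.example/v1/query")                            // hypothetical vendor endpoint
            .convertBodyTo(String.class)
            .process(exchange -> {
                // The real route maps each record in the XML response to its own table;
                // this sketch just stashes the raw payload to keep things short.
                String payload = exchange.getIn().getBody(String.class).replace("'", "''");
                exchange.getIn().setBody(
                    "INSERT INTO vendor2_raw (fetched_at, payload) VALUES (now(), '" + payload + "')");
            })
            .to("jdbc:reportingDb");                                              // DataSource registered as "reportingDb"
    }
}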

Just that easy? Well…

(Screenshot: one of our Talend routes. That kind of “easy.”)

In this case, we used the Community Edition of the ESB to see what it could do. One thing we found right away was that Talend organizes things in two ways: “jobs” and “routes.” The routes side is based on something Enterprise Architecture veterans will know well: Apache Camel. Working with an agreed-upon standard has its own advantages, but we also found routes to be more robust than jobs. For instance, they could handle exceptions, such as the API responding slowly, and cases where we needed to “page through” long sets of data. With that, we were off and running, with a few hurdles to hurdle.
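
To make that robustness concrete, here is a hypothetical retry policy in plain Camel DSL (the endpoint, timings, and route are all invented for illustration): a slow or failing vendor call is retried a few times with a delay and then logged, instead of aborting the whole sync run.

import org.apache.camel.builder.RouteBuilder;

public class RetrySketchRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Route-level exception handling: retry with a delay, then mark the
        // exchange handled and log, rather than failing the entire run.
        onException(Exception.class)
            .maximumRedeliveries(3)
            .redeliveryDelay(10000)
            .handled(true)
            .log("Vendor 2 call still failing after retries: ${exception.message}");

        from("timer:healthPoke?period=60000")
            .to("http://api.vendor2.example/v1/ping");   // hypothetical, slow-to-respond endpoint
    }
}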

Nice Flow Diagrams Do Not Mean Non-Technical: Starting with a “route,” we went data object by object, creating a parallel data model on our side so we could write the SQL, and mapping each object to a specific API call. To the uninitiated, the not-so-user-friendly Camel calls look like this:

.setHeader("RowCountInPage").groovy("headers.RowCountInPage = headers.RowCountInPage.isInteger() ? headers.RowCountInPage.toInteger() : 0")

Not exactly drag-and-drop syntax. That’s a fairly simple one, actually, but even so it’s using Camel along with Groovy scripting — and it can only be viewed or edited via a “property” of one of those flow icons, not in a text file. The GUI aspect falls away fairly quickly.

In short, this is a case that called for real development. It’s not rocket science but also not to be taken lightly. Don’t let the nice flow diagram fool you.

An API Is A Unique Blossom (sometimes): On the Vendor 2 side of things, they do have an API, but there were no quick answers here. You can do an awful lot with a full-featured API, but it might take a while to learn how, as each API is a little different. In this case, each call required crafting a specific XML structure, with a unique manner of getting large data sets page by page and with sometimes opaque terminology. There was no easy “getProjects()” type of call to fall back on. We were able to work our way through Vendor 2’s documentation, but it also made us appreciate the solution we designed for Vendor 1, which allowed us to avoid that level of mucking about in somebody else’s data model.
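
Stripped of the Talend and Camel machinery, the paging pattern we had to implement looked roughly like the plain-Java sketch below. The endpoint, the XML envelope, the row tag, and the page size are all invented for illustration; the idea is simply to keep requesting pages until the vendor hands back fewer rows than we asked for.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class Vendor2PagerSketch {
    private static final int PAGE_SIZE = 200;

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        int page = 1;
        int rowsReturned;
        do {
            // Every call needs a hand-built XML envelope (structure invented here).
            String body = "<request>"
                    + "<object>projects</object>"
                    + "<page>" + page + "</page>"
                    + "<pageSize>" + PAGE_SIZE + "</pageSize>"
                    + "</request>";
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://api.vendor2.example/v1/query"))  // hypothetical endpoint
                    .header("Content-Type", "application/xml")
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());

            // Crude row count for the sketch; the real route parsed the XML properly.
            rowsReturned = response.body().split("<row>", -1).length - 1;
            System.out.printf("page %d: %d rows%n", page, rowsReturned);
            page++;
        } while (rowsReturned == PAGE_SIZE);
    }
}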

And Here You Thought Things Like Version Control Were Straightforward: Just when you thought you had git mastered and it would be easy to work in a team again, along comes a centralized system like this. As it turns out, a Talend workflow isn’t just a few nice, editable XML files. Instead, it creates sets of files and auto-updates portions of them in unpredictable ways. For instance, the updated date is part of a project file, so every save “changes” the project. Be sure to close Talend before committing your files, since they change while the Studio product is running!

Talend, the company, wants you to upgrade to the paid edition to get its internal version control, but that would also mean a separate repository specifically for their tool. In the end, we got it to work in our own version control and lived to tell the tale. Unfortunately, there were bumps in the road in places we thought might roll like the autobahn.

In general, Talend worked for us, but using the Community Edition wasn’t always so straightforward. For instance, going with the “routes” side of Talend diverged from Talend’s main offering in favor of the more standard Camel implementation. Using routes meant we could leverage lots of Apache Camel documentation, but it cut us off from much of Talend’s own forums and documentation, which focus on the “jobs” side. Alas, there wasn’t an easy middle ground that captured the positives of both sides.

In the end, Vendor 2 was a lot more work to integrate than Vendor 1. That’s no surprise. But now that we have it up and running, the volume of information we’re capturing and updating is huge, and we can write those reports however we want: business analytics packages, home-written darn-SQL statements, and so on. The monthly Excel re-work won’t be necessary. We did all of this without touching the main functions of Vendor 2.

We took on a lot more configuration work, but we now find ourselves with a full backup of our data, able to do what we want with it rather than only what the vendor allows. This level of integration also makes us a little less dependent on Vendor 2. Should we need to swap them out someday, we will start with all of our historical data completely at the ready.

After all, even Simon and Garfunkel eventually broke up.

Summer Internship 2014: Visualizing Workplace Data

This is a post by Samuel Lalrinhlua, a student at Syracuse University in the Master of Science in Information Management (2015) program. He was also a summer intern on our Enterprise Architecture team. 

I first came across Control Group when I read the ‘Best Places to Work 2012’ list published by Crain’s. I was immediately drawn in by the photo of their Star Trek-esque hallways and thought to myself, “That would be a cool place to work.” But I never thought in a million years that I would actually get the opportunity to work for this company and would be writing about my internship experience on their blog.

When I arrived in June, I was given a detailed description of the projects that I would be working on this summer: add visualizations of CG data to the monitors that hang above the Support Center and find other interesting ways to show data around the office. My fellow intern, Soohyun Park, and I were asked to collaborate and create visualizations that used and displayed dynamic data.

(Screenshot: the conference room availability display.)

I worked with several applications, such as Talend and PostgreSQL, to extract relevant internal data such as Personal Time Off (PTO) status, work anniversaries, timesheet usage, and project status, among other things. All of this data was used to create the visualizations that are now shown on the big screens in the office. Many of these technologies were new to me, and it took some troubleshooting along the way to see results. Soohyun and I also developed an iPad visualization that displays the status of the conference rooms: red shows “booked” and green shows “available.” App development was new to me, and I learned a lot from this experience.
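
For a rough idea of the kind of query that can drive a display like that (the database, schema, and credentials below are invented, not CG’s actual setup), a room shows red whenever any booking overlaps the current time:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class RoomStatusSketch {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:postgresql://localhost:5432/office";   // hypothetical database
        try (Connection conn = DriverManager.getConnection(url, "dashboard", "secret");
             PreparedStatement stmt = conn.prepareStatement(
                 "SELECT r.name, " +
                 "       EXISTS (SELECT 1 FROM bookings b " +
                 "               WHERE b.room_id = r.id " +
                 "                 AND now() BETWEEN b.starts_at AND b.ends_at) AS booked " +
                 "FROM rooms r ORDER BY r.name");
             ResultSet rs = stmt.executeQuery()) {
            while (rs.next()) {
                // Red means a meeting is happening right now; green means the room is free.
                System.out.println(rs.getString("name") + ": " + (rs.getBoolean("booked") ? "red" : "green"));
            }
        }
    }
}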

I am glad that I got to spend my summer with CG. I have gained invaluable experience, both professionally and personally. Thank you all for your support, and for the coffee (I’m going to miss that!). And thank you for making me a part of the Control Group team this summer.

“Live long and prosper.”