First post! Welcome to “Hadoop Hamburgers”, where I plan to write some posts about Hadoop and other topics that seem interesting. This first one is not related to Hadoop, but instead to DNS, a subject near and dear to the heart of my employer, Verisign. Everything in getting this site set up went fairly smoothly, including updating my registrar’s DNS records to point my domain name at my hosting provider. Being an impatient sort, I didn’t want to wait for the TTL on my domain name to expire, so I ran a dig query to see if my registrar had pushed through the change:
shell$ dig grepalex.com

;; ANSWER SECTION:
grepalex.com.    3600    IN    A    22.214.171.124
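Rather than re-running dig by hand while waiting for a change like this to propagate, the check can be scripted. A minimal sketch in Python (the hostname, expected address, and timing values below are illustrative placeholders, not from the post):

```python
import socket
import time

def wait_for_dns(hostname, expected_ip, timeout=600, interval=15):
    """Poll the OS resolver until `hostname` resolves to `expected_ip`.

    Returns True once the expected address appears, False on timeout.
    Note: unlike `dig`, this goes through the OS-level resolver, so a
    stale cached entry will keep returning the old address until its
    TTL expires or the cache is flushed.
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            if socket.gethostbyname(hostname) == expected_ip:
                return True
        except socket.gaierror:
            pass  # not resolvable yet; keep polling
        time.sleep(interval)
    return False

# Example (hypothetical): wait_for_dns("grepalex.com", "22.214.171.124")
```

Because this uses the same resolver path as the browser, it can also be handy for confirming exactly the caching behavior described below.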
Indeed they had! Next up was trying to hit my website from my browser. When I did that, however, Chrome was still showing my registrar’s advertising content. A few pokes around led me to Chrome’s internal page, chrome://net-internals/#dns, which lets you invalidate its DNS cache.
However, even after invalidating Chrome’s cache it still showed the content from the registrar. The cool thing about Chrome’s internal page is that it actually shows you the cached IP address, which was indeed still the old value. Clearly the OS X DNS client was performing some additional caching. After some more digging around I found the (Mountain) Lion-specific command which did successfully clear OS X’s cache:
shell$ sudo killall -HUP mDNSResponder
About the author
Alex Holmes is a senior software engineer with over 15 years of experience developing large-scale distributed Java systems. Since 2008 he has gained expertise in using Hadoop to solve Big Data problems across a number of projects. He is the author of Hadoop in Practice, a book published by Manning Publications. He has presented at JavaOne and Jazoon.