30.11.19

Coding and running


Coding is one of the things I have aspired to do since, like...forever! But finding a resource that is in sync with my comprehension and schedule, and able to retain my interest long enough, is a challenge. I have the attention span of a gnat, so I jumped everywhere! If I am not actively engaged with the learning, I just can't do it. And I know...we have DataCamp, Udemy, Khan Academy and even Kaggle...but I either can't keep up, am too poor to pay for the full course, or it just doesn't sync with me. I believe I can say that most of the exercises don't 'vibe' with me.

Recently, I committed myself to my one passion: running. It was one of my favorite activities back in school, but the will to really run died a decade ago. I recently picked up my running shoes and ran my little heart out despite having the speed of a running ant; aging, perhaps? And I owe my hardcore will to the motivation of earning back what I paid when I decided to join a month-long, 65km virtual run. It is called the 'Pave Your Path' virtual run, organized by Running Station. Nailed it 2 days ago after 13 sessions of 5km - yes, you can accumulate the distance over multiple runs. It made me realize that...it's not that bad. The 'near-death' experience while running has kinda turned me into a daredevil these days when it comes to undertaking things I'd have whined about doing a few months back.

"If I can go through dying every single evening for a 5km run...I can handle this,"

That's my thought exactly every time I feel reluctant to finish tasks I believe I could hold off for a while.

Naturally, I plan my work rigorously, yet despite the flexibility of my schedule and my detailed plans, I still have a hard time driving the last nail into a project's coffin. Usually it's because my brain is exhausted from overthinking, or I am just truly tired physically - which is weird, given that I do not farm for a living. Even so, I was lethargic all the time.

But when I started running a month ago, things kind of fell into place for me. Maybe...just maybe...I've become more alert than I used to be. I still ignore things that don't demand my immediate attention, but I seem able to network my thoughts faster than before.

It might just be me feeling like a new person out of sheer willpower not to burn the RM60 I paid for the virtual run, but it did feel like there was a change.

With that, I managed to confirm what I had suspected all along - I am one of those people who love drills. I like things drilled into my head until I know them by heart and can do them efficiently, and then I focus on polishing the effectiveness.

Thus...for coding, I committed myself to freeCodeCamp. By hook or by crook, I'll be coding by the first quarter of next year, or someone's head is gonna roll!


It's an interactive learning experience: simple enough for me to start, straightforward enough that I don't waste time searching for answers, and it's free. God bless Quincy Larson.


Going back to the curriculum outlined by freeCodeCamp, I find it fascinating that they start off with HTML. I have no arguments there. My impatience taught me a lesson - run too fast, and you'll burn out painfully and drop dead before you're halfway through. HTML is a very gentle introduction to coding for newbies since it's like LEGO building blocks, where you arrange blocks and match two to create something. I didn't have to go crazy with frustration if I didn't 'get' it. Yes, we would all want some Python lovin', and a lot of coders I have come to know rave about how simple it is to learn. But I think that is an opinion shared by 'experienced' coders who wish Python had been there when they first started coding. Someone once told me that what you think is the best based on others' experiences may not be the best for you...and I agree. After a lot of deliberation and patience on my end, starting over this time doesn't feel like the dreaded looming doom it always was back then.


Are you into coding? What do you code, and what's your language preference? Where did you learn coding? Feel free to share with me!


26.11.19

Binning and mapping

My favorite cartographer is John M. Nelson. In fact, he's the one who got me searching for what 'cartography' really is. Fortunately, he's a mix of storyteller, technical support analyst and designer, so his techniques are the ones I have the least trouble understanding. And this is by no means meant to offend, because really, I'm a little slow, and John is a very 'generous' teacher when it comes to explaining things, even through replies to comments. You can witness his work first-hand at his blog: https://adventuresinmapping.com/.



So, the first of his works that captured my attention is the Six Month Drought of the American Southeast map, created using the binning method. I didn't even know what binning was, but the map was so pretty it had me announcing my loyalty to #cartography hashtags.


So what is binning? According to GIS Lounge, binning is a data modification technique where the original data values are converted into a range of small intervals called bins. Each bin is then replaced with a value representative of that interval, reducing the number of data points.
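To make that concrete, here is a minimal toy sketch in Python (my own example, not from GIS Lounge): each hypothetical severity reading falls into a fixed-width interval and is replaced by that interval's midpoint.

```python
def bin_values(values, bin_width):
    """Replace each value with the midpoint of its fixed-width bin."""
    binned = []
    for v in values:
        lower = (v // bin_width) * bin_width   # the bin's lower edge
        binned.append(lower + bin_width / 2)   # representative midpoint
    return binned

# hypothetical drought-severity readings, binned into intervals of width 10
severity = [3, 7, 12, 14, 22]
print(bin_values(severity, 10))  # -> [5.0, 5.0, 15.0, 15.0, 25.0]
```

Five distinct readings collapse into three representative values, which is the whole point: fewer data points, easier symbology.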

Okay. It should be a no-brainer. But the data he used was polygon shapefiles of droughts' extents and their severity. I still don't know how the USGS actually collects this data, but his map sang a deserving anthem to their hard work. Alas, I never had the chance to reproduce it. I do not have the knack for identifying interesting data of any sort, so I'm either stuck reproducing redundant work or wasting my time on a wild goose chase for data; I'm a noob with tunnel-vision focus. I wouldn't even vote for myself if we had a jungle excursion that required mapping, because we'd be stuck longer than necessary.

Even so, one year later, precisely at this moment...I found a valid reason to attempt it. And it's all because I need to validate a satellite imagery classification some colleagues made to show hot spots of global deforestation. I am not a remote sensing wizard, but vector data...now that I can work with.

Using the same binning technique, I can summarize the steps as follows:


Merge all the data of the deforestation variables
Generate a hexagonal tessellation
Create the hexagon centroids
Use 'Spatial Join' to sum the weights of the merged data's overlapping polygon features and join them to the hexagon centroids
Then configure the symbology
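The spatial-join step above can be sketched as a toy in plain Python. Everything here is a stand-in assumption: axis-aligned rectangles play the role of the merged deforestation polygons, and a handful of points play the role of the hexagon centroids; a real workflow would use a GIS tool or library for the geometry.

```python
def point_in_rect(px, py, rect):
    """Crude stand-in for a real point-in-polygon test."""
    xmin, ymin, xmax, ymax = rect
    return xmin <= px <= xmax and ymin <= py <= ymax

def join_weights(centroids, polygons):
    """For each centroid, sum the weights of every 'polygon' containing it."""
    out = {}
    for (px, py) in centroids:
        out[(px, py)] = sum(w for rect, w in polygons
                            if point_in_rect(px, py, rect))
    return out

# two overlapping "polygons" carrying ranking weights 2 and 3
polygons = [((0, 0, 4, 4), 2), ((2, 2, 6, 6), 3)]
centroids = [(1, 1), (3, 3), (5, 5), (9, 9)]
print(join_weights(centroids, polygons))
# -> {(1, 1): 2, (3, 3): 5, (5, 5): 3, (9, 9): 0}
```

The centroid at (3, 3) sits inside both overlapping polygons, so it receives the summed weight of 5 - exactly the behavior of a 'Spatial Join' with a Sum merge rule.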

Visualizing was a herculean effort for my brain. The map John made is a bivariate map. Compared to his data, which has 2 numerical variables to enable that, mine had only one: the summation of the ranking weights I imposed on the deforestation variables. He merged shapefiles of week after week of drought severity readings. Me...I just managed this >>>




My first attempt was to visualize the probability of deforestation using the centroid point sizes alone.



That wasn't much of a success because, visually, it doesn't actually aid my comprehension. It looks good when you zoom in closer, because it gives off that newspaper-print feel with that basemap, but at the full extent it's not helpful.

So, after trying to no avail to make it work by toggling the sizes and the colors, I found that instead of trying to make it look nice, I'd better focus on answering the question posed by my colleague: could you identify the areas with a high likelihood of prolonged deforestation? For that purpose, only a hexagonal mesh would do the trick. So, based on the 10 sq km hexagons that depict the areas of deforestation from their image classification, I used 'Spatial Join' again to join the centroids back to their predecessor hexagons, so the hexagons carry the binned values.
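That second join can be sketched as a simple lookup, under one assumption of mine: that each centroid still knows the id of the hexagon it was generated from (which is what a centroid-from-polygon step typically gives you), so no geometric search is needed.

```python
def values_back_to_hexagons(hexagons, centroid_values):
    """Attach each centroid's binned value to its predecessor hexagon.

    hexagons        - list of dicts, each with a hypothetical "id" key
    centroid_values - {hexagon_id: summed weight} from the earlier join
    """
    return [{**hexagon, "binned": centroid_values.get(hexagon["id"], 0)}
            for hexagon in hexagons]

hexagons = [{"id": "H1"}, {"id": "H2"}, {"id": "H3"}]
centroid_values = {"H1": 5, "H2": 12}  # hypothetical summed weights
print(values_back_to_hexagons(hexagons, centroid_values))
```

Hexagons whose centroid touched nothing (here "H3") default to a binned value of 0, so every cell in the mesh still gets symbolized.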

Et voila!


The weight summation measures the degree of likelihood of prolonged deforestation, and the values range all the way up to 24. I made 4 intervals, which gave a practical visualization. Eight intervals were pushing it, and 6 were not pleasant. It could be my choice of color palette that made them unappealing, but too many intervals would defeat my purpose.

Yay or nay...I'm not too sure about it. But I do believe this summarizes the areas where conservationists should be on alert.

After a discussion with a colleague, yeah...this technique has a lot of gaps.

ONE; the underlying data is not a point feature. Using ONLY the values where the centroid touches/overlays the polygons is not exactly a precise method. Although, it is not wrong either.

TWO; the merged polygonal data came with OVERLAPPING polygonal features.

Overlooking the shortcomings and just using it as a visual aid for cross-checking...yeah, maybe. Even then, it's not as laser-precise as one would aspire to. I stand humbled.


