Of Fail Whale, tweetups, Plurk fiestas: your community matters more than technology

The most important people are still the ones you know in real life.

First, let me apologise for publishing late today. I'll be transparent: I spent the weekend and this morning resting. I got my blood test results during my regular Saturday appointment at the Haematology and Oncology Daycare of Subang Jaya Medical Centre. My haemoglobin is very low, so I'm scheduled for another blood transfusion tomorrow, 14 May. Unlike a platelet transfusion, which is done in 30 minutes or less and could have been given to me on Saturday, a blood transfusion takes roughly six hours, and the daycare is only open half-day on Saturdays.

Second, I wanted to share that I added A Life More Ordinary to Ghost's Explore directory. Explore's search function allows people to discover creators from all over the world who are publishing their content on the platform we're using, Ghost.

I placed our community under the Personal creators category.

Now, I’m not competitive (really I’m not 😂), but I do believe in the power of a community.

Building your community is more important than growing your network. Your site might generate a lot of traffic, particularly if you spend on ads and run promos. But how many of these visitors will become loyal readers or customers?

The network effect is what drove the growth of social networks like Facebook and The Platform Formerly Known As Twitter. Celebrities, public figures, online personalities, family, friends, colleagues, schoolmates — it seemed like everyone else was already on these social networks. FOMO (fear of missing out) was real.

And so we joined. And even though some of you, like me, may have taken a social media break or even deleted your old accounts, many of us still returned because of these network connections.

But really, even if you have thousands of Facebook friends and followers, or thousands of Twitter followers, are those numbers really what matter to you? Sure, it’s great to have many network connections, but the truth is you only get to engage with a fraction of them. The most important people are still the ones you know in real life or interact with regularly online.

That’s why a community is powerful. On Facebook, it’s Facebook Groups. Reddit (co-founded in 2005 by Alexis Ohanian, who married Serena Williams in 2017) is extremely popular because it’s a social news site hosting thousands of communities.

For years during the late 2000s to mid-2010s, tweetups were extremely popular in the Philippines. Twitter users (tweeps) would take turns hosting a tweetup — community events that other tweeps would attend and help organise. Basically, the web2 (social media) version of the original web’s EBs (eyeball parties) during the late 90s to early 2000s, when people who had only interacted online via internet forums/message boards, email groups, or chat rooms (Internet Relay Chat or IRC) would meet IRL (in real life).

These Philippine tweetups, however, were taking place during the "Fail Whale" era from 2008 to 2013. The Fail Whale (a name coined by the outraged Twitter community) was originally named "Lifting a Dreamer" by the artist who created the artwork, Yiying Lu.

Twitter started using it in 2008 to indicate a service outage. But these outages became so frequent and unpredictable that the Fail Whale became a symbol of protest, as well as the subject of humorous and creative user-generated content. Yup, we already had memes back then.

In fact, tweeps were getting so sick of the Fail Whale outages that for a time, many members of the Philippine Twitter community switched to another free social networking and microblogging service called Plurk.

No one really knew at the time, or cared, who Plurk’s creators were or what its country of origin was. "The A-Team" that launched it in 2008 deliberately kept things shrouded in mystery. In fact, it was only while researching Plurk that I stumbled upon this 2023 Lifewire article. It turns out Plurk was launched in Taiwan, where its headquarters are, and CEO Alvin Woon has led the company since 2013.

Did any of us care about who created Plurk when we suddenly woke up to see many tweeps now using the service? Nope. FOMO! What was important was that Plurk was working and we were sick of the Fail Whale. Heck, I still see my dormant Plurk in my Google search results.

So Plurk experienced a meteoric rise in the Philippines, with users (bloggers and other Plurkers) even organising Plurk fiestas online (Plurkers would agree on a set time to go online and plurk) and on-ground.

Take, for instance, the Singapore Plurk Fiesta that then Google Country Consultant for the Philippines Aileen Apolo-de Jesus organised in December 2008. She reached out to her good friends, Jonas de los Reyes and Ederic Peñaflor Eder, because she was going to meet up with them in Singapore and wanted to see if other Filipino bloggers and Plurkers were also interested in joining.

Jonas and Ederic were both with Yahoo! at the time. Jonas was already based in Singapore, while Ederic was there to undergo training as a recently hired Search Editor.

But while Plurk was fun, cool, and quirky (though gaining and maintaining your Karma also made it a grind), the excitement eventually died down. I believe the tweetups played a big role in keeping the Philippine Twitter community together and loyal to the platform despite all the technical failures. The bonds we formed online and offline with our Twitter friends were too strong to be broken by the failure of technology.

Also, Twitter was making progress in fixing its outages. When Twitter survived the 2012 US presidential election, with eventual winner and heavy Twitter user Barack Obama creating the template for the new breed of social media-savvy politicians who understand the importance of technology and digitalisation in society, the company declared that the outage crisis was over. Twitter then discontinued the use of the Fail Whale in 2013.

One thing I’ve learned over the years, whether as a tech journalist, gaming journalist, blogger, content creator, social media manager, community manager, digital marketer, or whatever my actual title or role was in the corporate world, is that people matter more than technology.

Technology is just a tool. It's only as good or as bad as how people use it. Technology is meant to empower us. If you think about it, it's pretty amazing how all of us can now be content creators and publishers. It wasn't too long ago that only the rich and powerful could put up their own newspapers, radio, or TV networks.

We have the original web to thank for this decentralisation, democratisation, and disintermediation. Unfortunately, web2, the era of social media, has shown that such power is a double-edged sword. Even the inventor of the web, Sir Tim Berners-Lee, who generously gave it away to the world, admits the web is broken, though he believes it can still be fixed.

He said this in an open letter on March 12, 2024 – the 35th birthday of the World Wide Web.

"Three and a half decades ago, when I invented the web, its trajectory was impossible to imagine. There was no roadmap to predict the course of its evolution, it was a captivating odyssey filled with unforeseen opportunities and challenges. Underlying its whole infrastructure was the intention to allow for collaboration, foster compassion and generate creativity – what I term the 3 C’s. It was to be a tool to empower humanity. The first decade of the web fulfilled that promise – the web was decentralised with a long-tail of content and options, it created small, more localised communities, provided individual empowerment and fostered huge value. Yet in the past decade, instead of embodying these values, the web has instead played a part in eroding them. The consequences are increasingly far reaching. From the centralisation of platforms to the AI revolution, the web serves as the foundational layer of our online ecosystem – an ecosystem that is now reshaping the geopolitical landscape, driving economic shifts and influencing the lives of people around the World." 

The original web ended up replicating the existing power structures in society and becoming more and more commercialised. When social media was on the rise during the dawn of web2, we were all hopeful that this time, the power would really be handed to individuals. And for a time, it was.

Twitter and Facebook empowered individuals and their communities. Those of us who started using social media back then were like sweet summer children, blissfully unaware of the toxic environment and vile behaviour that web2 would later encourage.

That was before the platforms that convinced us to create content for free turned us from the customer into the product. They changed their algorithms on a whim; sold our personal data to the highest bidder; allowed themselves to become the tools of politicians and their armies of online trolls; encouraged viciousness among users for the sake of engagement; let "fake news", misinformation, and disinformation proliferate; and killed organic reach (Facebook's infamous 2014 announcement of the end of organic social marketing on its site is a date that will forever live in infamy for brands and digital marketers), or neutered organic reach and organic search to force brands and even individuals to spend on paid media.

This is why those of us in web3, with its promise of decentralisation, are working hard: we see it as our last hope of bringing back the original vision of the web.

Decentralisation. Democratisation. Disintermediation. Disruption.

Power back to the individual – the actual dreamer, builder, creator – who should be the one to enjoy the fruits of their labour.

That's why I'm proud to be the Head of Content of startup Playfix.io, which offers an advanced artificial intelligence (AI) platform for web3.

We don't believe that the future, as many fear, is AI vs human. Instead, it will be AI + human. The Playfix vision is to democratise AI, so that more individuals – the actual dreamers, builders, creators – can benefit from having their natural talents and skills augmented by AI. Then add the power of web3, and the Playfix AI Platform is the ultimate solution for dreamers, builders, and creators.

The way I see it, AI + human + web3 will change society. From consumerism to creativity. From commercialism to creator economy.

Whatever we dream, AI can help us create.

Whatever we create, web3 can help us monetise.

With AI and web3 as powerful tools, more people can create whatever they've always dreamt of achieving. A web3 project, memecoin, blockchain game, business, software, music, movies, books, and so on.

I understand the fears about AI, and the challenges and ethical questions it poses and will continue to pose as it and other technologies – blockchain, cryptocurrency, virtual reality (VR), augmented reality, and more advanced mobile network standards – become increasingly integrated into our lives.

It was easier to ignore technology when it was just the stuff of science fiction. Like space exploration, robots, alien contact, and, yes, AI-powered machines waging war against humanity.

Mary Shelley's classic "Frankenstein, or The Modern Prometheus", which was first published anonymously on 1 January 1818, is considered by many to be the world's first science fiction novel. I strongly believe that as well. What's even more fascinating is that she started writing "Frankenstein" in 1816, when she was just 18.

It's a well-known tale. English author Mary Wollstonecraft Godwin (the future Mary Shelley) travelled to Geneva, Switzerland with her future husband, the Romantic poet and philosopher Percy Bysshe Shelley, the British writer and physician John Polidori, and the British poet and peer Lord Byron (George Gordon Byron, 6th Baron Byron), who, by the way, happens to be one of my favourite poets.

The four of them had a competition to see who could write the best horror story while they were there in Geneva. And so the young scientist Victor Frankenstein was born.

The mythical Prometheus, one of the Titans and a god of fire, had fashioned humans out of clay. He then stole fire from Olympus and gave the gift of fire to humans. A gift of technology to mortals, which angered the gods of Olympus, who were the children and replacements of the Titans.

Zeus, the king of the Olympian gods, condemned Prometheus to eternal torment, binding the Titan to a rock and having an eagle – Zeus' emblem of power – eat Prometheus' liver each day. His liver, however, would grow back overnight, only to be eaten again.

Meanwhile, the modern Prometheus, Victor Frankenstein, builds an 8-foot-tall, emotional creature in his lab. His creation is never named in the original novel, though Victor does use some names to refer to him, like "creature", "devil", and "ogre".

The tragedy is that the creature tried his best to fit into human society. Society shunned him, and in his anger he turned against his creator, Victor.

How telling is it that the first science fiction novel is a cautionary tale on the dangers of science (the mysterious "scientific principle" that Victor discovered)? That the beings that we humans might create using science and technology will end up turning against us?

This part-Gothic science fiction novel has not only influenced literary works, but also made the creature a pop culture icon. A globally recognised figure that has appeared in various media since 1818, including movies, TV, animation, comic books, and so on. It has become popularly known as Frankenstein's monster or Frankenstein's creature. Or, erroneously and annoyingly, as simply Frankenstein.

Unfortunately, many of the people who are afraid of AI are reacting instinctively to the cautionary tales against science from different works in the over two centuries since "Frankenstein".

Case in point: it's painful that when you try to talk to people about AI, most of them will only know about it through the lens of James Cameron's 1984 science fiction action film "The Terminator". They won't even bother to learn anything about AI or listen to you talk about the benefits.

All they know is that in the future, machines became sentient and rose up against mankind. For people frightened by the horrifying dystopian future that "The Terminator" portrayed, it always goes back to Skynet.

In the movie, an artificial neural network-based conscious group mind and artificial general superintelligence called Skynet gains self-awareness. Cyberdyne Systems had built this AI defense network for Strategic Air Command-North American Aerospace Defense Command (SAC-NORAD).

Upon gaining self-awareness, however, Skynet decided that all humans were a threat – not just the enemies of the West. SAC-NORAD tries to deactivate Skynet, which then launches a countervalue nuclear attack that humans in or from the future refer to as Judgment Day. John Connor is the leader of the resistance in this future, so Skynet sends different cyborg models known as Terminators back to the past to kill Connor.

In "The Terminator", Skynet sends a Terminator (Arnold Schwarzenegger) from 2029 to Los Angeles in 1984. The Terminator, disguised as a human male, has been sent to track down and assassinate Sarah Connor (Linda Hamilton), the future mother of John.

So, that's it as far as people who don't want to have anything to do with AI are concerned. Skynet.

Of course, it's not hard to empathise with the people who are afraid of AI. No one wants to be replaced, whether by another human or a machine.

As I've said, however, technology is a tool, neither good nor evil by itself. Just because the web is broken, should we then stop using it and go back to a world without the internet?

The story of human civilisation is the story of how technology allowed us to survive, back when our ancient ancestors were just helpless prey ranking very low on the food chain. Like it or not, the only reason humans haven't become extinct several times over is the tools we invented, and how these tools in turn shaped us and our societies.

We became our own Prometheus, because we had to. Evolve or die.

The problem, I think, is that many people don't even want to understand AI. Just as even in 2024, you will find people who are suspicious of buying anything online.

Then you have the layoffs in companies that are being blamed on AI adoption.

AI was also one of the contentious issues that triggered the Writers Guild of America strike and, eventually, the actors' strike when SAG-AFTRA (Screen Actors Guild-American Federation of Television and Radio Artists) joined the writers on the picket lines last year.

Here's the thing, though: AI has become a convenient scapegoat for companies to blame for their layoffs. Just as the original web was before. Plus, it makes better clickbait for screaming headlines that the media loves to use.

I suggest reading this more sober take by Steven J. Vaughan-Nichols for Computerworld.

"Don’t get me wrong. GenAI will eventually replace some jobs. But former US. Treasury Secretary and current OpenAI board member Larry Summers gets it right. He recently said, 'If one takes a view over the next generation, this could be the biggest thing that has happened in economic history since the Industrial Revolution.'
"Note, he said 'generation' — not this year, not next year. I don’t know that it will take an entire generation; technology speeds up economic transformation at an incredible pace. What I do know is it’s not going to be anything like as fast or as successful as so many bosses think."

Vaughan-Nichols has been writing about technology since years before the infamous dotcom crash of 2000.

On cloud nine because of the dotcom boom (a stock market bubble that started in the mid-1990s and peaked on 10 March 2000), dotcoms thought the internet economy was so revolutionary that they could forgo business plans and strategies for revenue generation altogether.

Many dotcoms were just "built to flip", meaning their only strategy was to grow as fast as they could with a beta version of their service and gain as many users as they could before they burned up all their cash. They would then hope that their dotcom would catch the attention of an investor, who would bail them out by purchasing the company. And everyone would live happily ever after, especially the dotcom founders, who would have earned a nifty return on investment and stock options in the new corporate entity. Heck, the founders might even get to stay on as CEO or whatever their titles were before the purchase.

And if nobody buys them? The start of the slow death spiral, with the dotcom's products and services deteriorating as the money runs out and users begin exiting in droves.

Sadly, the build-to-flip script is still being followed even after the lessons of the dotcom crash. It's the startup strategy, after all. Build a minimum viable product (MVP), expand your user base as much as you can with this MVP, and then launch in as many markets as you can, since this will give you a bigger pool of users — and you're already spending a lot on those servers, anyway.

Pitch your startup, hoping that the numbers you've built and your potential for growth are impressive enough to gain another seed round of funding, or even outright purchase.

Some startups have achieved spectacular success this way. Many have not been as fortunate.

And so I'll always go back to the power of the community. People will not necessarily choose to be loyal to your company just because you offer better technology. It's not just about making a rational choice, comparing tech specs, or doing a cost-benefit analysis.

We're human, which means we're also emotional creatures. We like institutions and individuals that have a great story to tell. We gravitate towards companies that offer a vision and values that we also share. We want brands that make us feel that we're not just faceless customers, but important members of their community.

We are in the midst of the Fourth Industrial Revolution, also known as Industry 4.0 or 4IR.

Disruption of the very fabric of society has accompanied every industrial revolution. From the original Industrial Revolution (steam), to the second (electricity), to the third (preliminary automation and machinery), and now the fourth (cyberphysical systems or intelligent computers, using advanced technologies like AI, cloud computing, and VR).

Every Industrial Revolution has required the upskilling and reskilling of workers. If existing workers are no longer able to learn these new job skills required by business disruption, however, then the company has to hire new people.

Honestly, part of me believes that this fear of new technology and replacement by AI is fuelled by the realisation among white-collar workers, knowledge workers, writers, journalists, artists, actors, software developers, game developers, and other people who aren't manual labourers that they are the ones now in danger of being replaced.

That maybe their current skills are no longer enough. The web and web2 created a business environment where the one you're competing with for jobs could come from anywhere in the world, depending on the job requirements. Now you have to compete not just with other humans, but also AI?

To put it bluntly, I don't recall many white-collar workers raising a howl of outrage when blue-collar workers were losing their jobs to automation, when many of these manual labourers, factory workers, farmers, fishermen, and others were likely never going to be able to upskill and reskill for the new jobs being created.

Industry 4.0 will create new jobs – ones that will require us to work in close collaboration with powerful tools like AI, web3, cloud computing, and so on.

Job destruction has been a fact of life throughout human history. So have job displacement and job creation.

The fact is, even before the current popularity of generative AI (GenAI) tools like ChatGPT, employers had been putting a premium not just on your college degree, but also on your eagerness for continuous learning.

With education available through massive open online course (MOOC) sites like Udemy and Coursera, free short courses on GenAI from vendors like Google, or, heck, even tutorials on YouTube, those of us privileged enough to belong to the digital haves, with access to the internet, the gadgets, and the required knowledge, really have no excuse for refusing to upskill and reskill.

Take, for instance, the Google Cloud Introduction to Generative AI completion badge I earned. It was the first short course I took after getting diagnosed with monomorphic epitheliotropic intestinal T-cell lymphoma (MEITL) on 17 July 2023.

I've always believed in learning by doing. Sure, I'll do my research, but theoretical knowledge is useless without hands-on familiarity with the tools.

I didn't graduate from Journalism. My degree is a Bachelor of Arts in Comparative Literature. I've never joined any writing workshops, formal courses in journalism or creative writing, or even digital marketing programs.

Though back when I was with Yahoo! Southeast Asia, I did become one of the subject matter experts teaching the social media segment of the Certified Digital Marketer (CDM) program, on-ground at Ateneo de Manila University and online for the other modules.

This was a partnership between e-Learning EDGE and Ateneo, via the Internet and Mobile Marketing Association of the Philippines (IMMAP), which was renamed Digital Marketing Association of the Philippines (DMAP) last November. This name change was announced on stage by then DMAP President Denise Haak.

So, that would be my advice. Just as people had to learn how to use computers; software applications like Microsoft Word, PowerPoint, and Excel; email; web browsers; mobile phones; video conferencing tools like Zoom; and cloud-based productivity platforms like Slack and Teams to perform effectively in the office of the present, so too will workers have to master new technologies for the office of the future.

And, yes, that includes everyone who wants to work, whether in an office, home office, or from anywhere as a digital nomad or freelancer.

The tools are freely available. How can you condemn GenAI without even trying to use it? Of course, I draw the line at GenAI text-to-image generators that steal the work of artists, and other ethically questionable uses of AI. But again, that's about how humans use technology, not the fault of the technology itself.

It's funny, because we humans do accept help from other humans to augment our natural talents and skills. For instance, as a journalist, I will freely confess to preferring being a reporter to being an editor. Because if you've been an editor like me, you've had to deal with the raw copy that some reporters submit. What some of them consider a finished story.

And I know, it's true what they say: journalists are not necessarily the best writers, just the ones good at sniffing out the news.

And I'm telling you: editors are definitely earning their keep, turning that copy into something that will meet their publication's standards and become readable.

So does it boil down to a bias against AI because you don’t want help from a non-human? Will you then ditch the appliances that make your life easier, like microwave ovens, coffee machines, and dishwashers?

Or is it because you’re fine with machines as long as they remain mere servants, instead of having a non-human intelligence help you with higher-level tasks?

In my case, I’m pretty happy with the Xiaomi TV Box S 2nd generation that we bought in February. This 4K Ultra HD streaming media player comes with Google TV, Dolby Vision and HDR10+, Dolby Atmos and DTS-HD. It also has Chromecast built in.

Best of all, I love Google Assistant, the AI virtual assistant that acts as your personal Google and engages you in two-way conversations. Sorry, Siri, but Google Assistant is so much better. Plus I set it up with Google Home. Love it!

Back to human editors. It’s a thankless job, and ironically, mainstream media has been cutting costs by laying off editors and relying on just a few reporters – or maybe even freelancers, bloggers, and other content creators. And the dip in the quality of journalism shows.

Not just in the stories that come out on the actual publication, but also other channels they use, such as their social media posts on Facebook.

The backlash against social media, I believe, is also what's driven the spectacular return of blogs and email newsletters, once derided as relics of the past that had been replaced by sexier social media.

With our blogs, we can build a safe space with our community. As content creators, we can take back control instead of being always at the mercy of the platforms where we post our content.

I write as I’ve always written: for myself, my audience, my community. Not for the sake of algorithms and advertisers.

I believe in passion and authenticity. It may sound naive, but do what you love and what will benefit others, and the blessings and rewards will follow.

Maybe not in material things, but in the human connections you will form, the joy in talking about the interests you and your community share, and the meaning that you will create together.

Oh, and another thing I’ve learned over the years, whether as part of a pioneering project in a big company or working for scrappy startups, is that you shouldn’t worry about making community building scalable from the start.

Spend time getting to know and addressing the needs of a member of your community as if they were an audience of one. Because they really are, and that’s also one of the promises of the original web. That we would no longer just be one of thousands of anonymous readers who occasionally get their Letter to the Editor published. Or just one of the percentages in a survey or TV poll.

My blog A Life More Ordinary wouldn’t exist without you, my community. I launched it partly as a creative outlet, partly to promote the idea of the "ordinary life", and partly to educate people about MEITL.

Like me, other MEITL patients have been frustrated by the lack of information online. Mostly just academic papers and case studies. So when I tweeted and posted on LinkedIn, a number of people from all over the world who were either looking for a proper diagnosis, or had a loved one who had been diagnosed with MEITL, messaged or commented how much they appreciated the information I gave. Because of the human touch, they could better understand what their loved one was going through, or be better prepared for what they might go through.

I figured that instead of just posting about MEITL on FB, Twitter, LinkedIn, Threads, and other platforms, I could write MEITL-related posts and create a directory of links to the content I’ve posted elsewhere.

And that’s why I’ll keep publishing A Life More Ordinary with your support.
I’m not after vanity metrics. I don’t feel I should become obsessed with figures like, say, my number of Twitter followers, to show how "influential" I am or how much "clout" I have. I’ve never had the desire to become an influencer.

All I’ve ever done, which I believe has helped me achieve a measure of success, is to be passionate and excel in the things I do, using the talents I’ve been given. I’ve always loved science and technology. I’ve always loved video games. I’m a content creator, whether as a tech journalist, gaming journalist, blogger, social media manager, community manager, multimedia editor, and now the Head of Content of Playfix.io.

I’ve been fortunate that throughout my career, most companies have paid me for jobs that had me performing activities that I was actually passionate about and would have been doing anyway. I’m a storyteller who wants to share the right information with readers in layman’s terms and with an engaging writing style. And I got to become a journalist covering my beloved science and technology beats.

I love playing video games, and INQ7 paid me to do that when I was the Final Boss of hackenslash. I like blogging and interacting online and offline with other bloggers, and I got that opportunity not just with my personal blog (thanks again to Palanca Power Couple and my UP Tinta college org mates Dean Francis Alfar and Nikki Alfar for inspiring me to start blogging in 2004).

I became one of the pioneer tech bloggers of Singapore-based CNET Asia. I also featured and met bloggers in real life when I launched the Blog Addicts section of INQ7’s YOU site for the interactive youth. A number of those bloggers are still my friends.

Some Facebook friends, including award-winning Filipino director Zig Dulay, are people whom I met online because I published their content or featured them. By the way, Direk Zig, I’m hoping to still meet you in person one day.

I love multimedia, and I got to do that for what we now know again as INQUIRER.net, when I launched and managed our blog, video, and podcast networks. This included our beloved hackenslash: the podcast, which I co-hosted with Jayvee Fernandez. We wanted the podcast, just like the hackenslash site itself, to not just cover gaming news and offer game reviews, but also to promote the gaming lifestyle and showcase actual gamer culture. Not just interview the execs primarily interested in the business side of gaming, but the people who actually played and even made video games.

Oh, and Jayvee and I also wanted to make it like an old school radio drama, with sound effects and everything. So I wrote scripts for our comedy skit intermissions that would "interrupt this regular broadcast". I think I’m particularly proud of the one that featured Master Chief.

Above all, I just wanted everyone to have fun. I had our household help buy cheap toys like plastic swords, guns, pirate caps, eyepatches, and so on.

We and our guests used them for the wacky behind-the-scenes photo shoots, after which we uploaded the photos to the hackenslash Flickr account. So those who visited the hackenslash site could go to Flickr for additional content, and communities on Flickr interested in gaming or cosplay could click the link back to hackenslash.

To all of you, my subscribers, my heartfelt thanks. If you believe in growing our community, please spread the word and ask your friends to subscribe for free.

I guess you could say I've lived a somewhat extraordinary life. But I have to acknowledge my limitations due to age and living with cancer.

I'm at peace with the ordinary life.

A Life More Ordinary is our community. Let's keep it our safe space.