About LinuxChix Live

LinuxChix Live is a collection of weblog entries by members of LinuxChix.

LinuxChix Live is automatically generated from the RSS/Atom feeds of contributors' weblogs and includes personal, political and technical writing as they choose. All entries remain the copyright of the individual contributors.

If you would like your entries included, please contact us and tell us the URL of your RSS or Atom feed. Please use the Feed Validator to check your feed before sending it in.

Winking Microview

Terri - 6 hours 23 min ago
This is crossposted from Curiousity.ca, my personal maker blog. If you want to link to this post, please use the original link since the formatting there is usually better.

With my travel and work schedules, I haven’t had time to hack my original MicroView, but the replacement ones arrived while I was out at ABQ Mini Maker Faire! So of course, I had to try *something* now that I can actually flash things to it.


Here’s my current very simple program: a smile with a wink!


[Image: microview_wink]


Although it’s probably better with video



And of course, it’s more fun if you can also check out the code so I dumped it into my git repository. Here it is in case you’re not feeling like clicking through:



/*
 * microview_wink: a simple winking face animation for the MicroView
 *
 * Created by: Terri Oda


Fossetcon 2014

Elizabeth Krumbach - 12 hours 49 min ago

As I wrote in my last post I attended Fossetcon this past weekend. The core of the event kicked off on Friday with a keynote by Iris Gardner on how Diversity Creates Innovation and the work that the CODE2040 organization is doing to help talented minorities succeed in technology. I first heard about this organization back in 2013 at OSCON, so it was great to hear more about their recent successes with their summer Fellows Program. It was also great to hear that their criteria for talent not only included coding skills, but also sought out a passion for engineering and leadership skills.

After a break, I went to see PJ Hagerty give his talk, Meetup Groups: Act Locally – Think Globally. I’ve been running open source related groups for over a decade, so I’ve been in this space for quite a long time and was hoping to get some new tips, and PJ didn’t disappoint! He led off with the need to break out of the small “pizza and a presentation by a regular” grind, which is indeed important to growing a group and making people show up. Some of his suggestions for doing this included:

  • Seek out students to attend and participate in the group, they can be some of your most motivated attendees and will bring friends
  • Seek out experienced programmers (and technologists) not necessarily in your specific field to give more agnostic talks about general programming/tech practices
  • Do cross-technology meetups – a PHP and Ruby night! Maybe Linux and BSD?
  • Bring in guest speakers from out of town (if they’re close enough, many will come for the price of gas and/or train/bus ticket – I would!)
  • Send members to regional conferences… or run your own conference
  • Get kids involved
  • Host an OpenHack event

I’ll have to see what my co-conspirators/organizers at some local groups think of these ideas; it certainly would be fun to spice up some of the groups I regularly attend.

From there I went to MySQL Server Performance Tuning 101 by Ligaya Turmelle. Her talk centered around the fact that MySQL tuning is not simple, but went through a variety of mechanisms to tune it in different ways for specific cases you may run into. Perhaps most useful to me were her tips for gathering usage statistics from MySQL; I was unfamiliar with many of the metrics she pulled out. Very cool stuff.

After lunch and some booth duty, I headed over to Crash Course in Open Source Cloud Computing presented by Mark Hinkle. Now, I work on OpenStack (referred to as the “Boy Band” of cloud infrastructures in the talk – hah!), so my view of the cloud world is certainly influenced by that perspective. It was great to see a whirlwind tour of other and related technologies in the open source ecosystem.

The closing keynote for the day was by Deb Nicholson, Style or substance? Free Software is Totally the 80′s. She gave a bit of a history of free software and speculated as to whether our movement would be characterized by a shallow portrayal of “unconferences and penguin swag” (like 80s neon clothes and extravagance) or how free software communities are changing the world (like groups in the 80s who were really seeking social change or the fall of the Berlin wall). Her hope is that by stepping back and taking a look at our community that perhaps we could shape how our movement is remembered and focus on what is important to our future.

Saturday I had more booth duty with my colleague Yolanda Robla who came in from Spain to do a talk on Continuous integration automation. We were joined by another colleague from HP, Mark Atwood, who dropped by the conference for his talk How to Get One of These Awesome Open Source Jobs – one of my favorites.

The opening keynote on Saturday was Considering the Future of Copyleft by Bradley Kuhn. I always enjoy going to his talks because I’m considerably more optimistic about the health and future of free software, so his strong copyleft stance makes me stop and consider where I truly stand and what that means. He worries that an ecosystem of permissive licenses (like Apache, MIT, BSD) will lead to companies doing the least possible for free software and keeping all their secret sauces secret, diluting the ecosystem and making it less valuable for future consumers of free software since they’ll need the proprietary components. I’m more hopeful than that, particularly as I see real free software folks starting to get jobs in major companies and staying true to their free software roots. Indeed, these days I spend the vast majority of my time working on Apache-licensed software for a large company that pays me to do the work. Slides from his talk are here; I highly recommend having a browse: http://ebb.org/bkuhn/talks/FOSSETCON-2014/copyleft-future.html

After some more boothing, I headed over to Apache Mesos and Aurora, An Operating System For The Datacenter by David Lester. Again, being on the OpenStack bandwagon these past few years I haven’t had a lot of time to explore the ecosystem elsewhere, and I learned that this is some pretty cool stuff! Lester works for Twitter and talked some about how Twitter and other companies in the community are using both the Mesos and Aurora tools to build their efficient, fault tolerant datacenters and how it’s led to impressive improvements in the reliability of their infrastructures. He also did a really great job explaining the concepts of both, hooray for diagrams. I kind of want to play with them now.

Introduction to The ELK Stack: Elasticsearch, Logstash & Kibana by Aaron Mildenstein was my next stop. We run an ELK stack in the OpenStack Infrastructure, but I’ve not been very involved in the management of that, instead focusing on how we’re using it in elastic-recheck, so I hoped this talk would fill in some of the fundamentals for me. It did, so I was happy with that, but I have to admit that I was pretty disappointed to see demos of plugins that required a paid license.

As the day wound down, I finally had my talk: Code Review for Systems Administrators.


Code Review for Sysadmins talk, thanks to Yolanda Robla for taking the photo

I love giving this talk. I’m really proud of the infrastructure that has been built for OpenStack and it’s one that I’m happy and excited to work with every day – in part because we do things through code review. Even better, my excitement during this presentation seemed contagious, with an audience that seemed really engaged with the topic and impressed. Huge thanks to everyone who came and particularly to those who asked questions and took time to chat with me after. Slides from my talk are available here: fossetcon-code-review-for-sysadmins/

And then we were at the end! The conference wrapped up with a closing keynote on Open Source Is More than Code by Jordan Sissel. I really loved this talk. I’ve known for some time that the logstash community was one of the friendlier ones, with their mantra of “If a newbie has a bad time, it’s a bug.” This talk dove further into that ethos in their community and how it’s impacted how members of the project handle unhappy users. He also talked about improvements made to documentation (both inline in code and formal documentation) and how they’ve tried to “break away from text” some and put more human interaction in their community so people don’t feel so isolated and dehumanized by a text only environment (though I do find this is where I’m personally most comfortable, not everyone feels that way). I hope more projects will look to the logstash community as a good example of how we all can do better, I know I have some work to do when it comes to support.

Thanks again to conference staff for making this event such a fun one, particularly as it was their first year!

Categories: LinuxChix bloggers

Ubuntu at Fossetcon 2014

Elizabeth Krumbach - Tue, 2014-09-16 17:01

Last week I flew out to the east coast to attend the very first Fossetcon. The conference was on the smaller side, but I had a wonderful time meeting up with some old friends, meeting some new Ubuntu enthusiasts and finally meeting some folks I’ve only communicated with online. The room layout took some getting used to, but the conference staff was quick to put up signs and direct conference attendees the right way, in general leading to a pretty smooth conference experience.

On Thursday the conference hosted a “day zero” that had training and an Ubucon. I attended the Ubucon all day, which kicked off with Michael Hall doing an introduction to the Ubuntu on Phones ecosystem, including Mir, Unity8 and the Telephony features that needed to be added to support phones (voice calling, SMS/MMS, cell data, SIM card management). He also talked about the improved developer portal with more resources aimed at app developers, including the Ubuntu SDK and simplified packaging with click packages.

He also addressed the concern of many about whether Ubuntu could break into the smartphone market at this point, arguing that it’s a rapidly developing and changing market, with every current market leader only having been there for a handful of years, and that new ideas are needed to play to win. Canonical feels that convergence between phone and desktop/laptop gives Ubuntu a unique selling point and that users will like it because of intuitive design with lots of swiping and scrolling actions, which gives apps the most screen space possible. It was interesting to hear that partners/OEMs can offer operator differentiation as a layer without fragmenting the actual operating system (something that Android struggles with), leaving the core operating system independently maintained.

This was followed up by a more hands on session on Creating your first Ubuntu SDK Application. Attendees downloaded the Ubuntu SDK and Michael walked through the creation of a demo app, using the App Dev School Workshop: Write your first app document.

After lunch, Nicholas Skaggs and I gave a presentation on 10 ways to get involved with Ubuntu today. I had given a “5 ways” talk earlier this year at SCaLE in Los Angeles, so it was fun to do a longer one with a co-speaker and have his five items added in, along with some other general tips for getting involved with the community. I really love giving this talk; the feedback from attendees throughout the rest of the conference was overwhelmingly positive, and I hope to get some follow-up emails from some new contributors looking to get started. Slides from our presentation are available as a PDF here: contributingtoubuntu-fossetcon-2014.pdf


Ubuntu panel, thanks to Chris Crisafulli for the photo

The day wrapped up with an Ubuntu Q&A Panel, which had Michael Hall and Nicholas Skaggs from the Community team at Canonical, Aaron Honeycutt of Kubuntu and myself. Our quartet fielded questions from moderator Alexis Santos of Binpress and the audience, on everything from the Ubuntu phone to challenges of working with such a large community. I ended up drawing from my experience with the Xubuntu community a lot in the panel, especially as we drilled down into discussing how much success we’ve had coordinating the work of the flavors with the rest of Ubuntu.

The next couple days brought Fossetcon proper, which I’ll write about later. The Ubuntu fun continued though! I was able to give away 4 copies of The Official Ubuntu Book, 8th Edition which I signed, and got José Antonio Rey to sign as well since he had joined us for the conference from Peru.

José ended up doing a talk on Automating your service with Juju during the conference, and Michael Hall had the opportunity to give a talk on Convergence and the Future of App Development on Ubuntu. The Ubuntu booth also looked great and was one of the most popular of the conference.

I really had a blast talking to Ubuntu community members from Florida, they’re a great and passionate crowd.

Categories: LinuxChix bloggers

F Yeah!

Noirin Plunkett - Tue, 2014-09-16 14:46

TL;DR: Give The Ada Initiative $128 today (or $10/month), and you’ll get the coolest sticker I’ve seen in ages!

 Feminism

For a long time, I was afraid to use the F-word. I studied CS, linguistics & German at university, and got into open source as a user and documentation contributor, more than a decade ago. I pretty quickly moved on to more community- and event-organisational things in my hobby/open source work, while continuing as a tech writer in my professional work (which has been a mix of open source and not).

And throughout my education, and my early open source days, I really tried to be “one of the guys”. It was hard, but I got good at it. I was elected to the board of the Apache Software Foundation, and appointed Executive Vice President of the Foundation. I remain the only woman to have achieved either of those distinctions. And when I did, I was told–on a mailing list!–that whether or not I was comfortable with the choice of imagery, I had big balls.

By then, though, I had realised that something was broken. Why should I need to have balls, whatever their size, metaphorical or literal, to do something I loved and was good at? Why should I have to adapt myself to fit an environment that was built by and for people who weren’t like me?

The Ada Initiative was still just an idea, without a real shape, or even a name, at that time. But as that shape emerged, from the two awesome women who founded it, through the group of role models and inspirations who advise and continue to run it, to the innumerable supporters who make it possible, it has made real and lasting change. It has provided a welcoming environment of its own, and has given many other groups the tools they needed to adapt the environments they had built, to be inclusive of people who weren’t like them.

Right now, I’m working–and being paid!–to help organize AdaCamp Berlin. This is literally a dream come true for me: volunteer work is amazing, but it has to be singing in a choir or it very quickly leads to burnout. Seeing the huge demand for AdaCamp Berlin, as well as the interest from other folk in running similar events in the future, is incredibly rewarding. And being paid for the value I bring is an important act of feminism.

The Ada Initiative is doing amazing work, but it needs individual donations, not just ethical corporate sponsors, to keep it going. Donate today to help us run more AdaCamps and Ally Skills Workshops, and develop more programs like our Anti-Harassment Policy and our Impostor Syndrome Training.

And, if you give $128 (or $10/month), you can get an awesome sticker pack, including the F-Word sticker designed just for this campaign! Stick it to your laptop, or to the man–it’s up to you :-)

Categories: LinuxChix bloggers

Why “Charge Less” is Almost Always the Wrong Advice

Erica Douglass - Mon, 2014-09-15 19:47

When I decided to become a coach in July, I sought out several experienced coaches for advice. After all, if I was going to make this my new business, I wanted to know what to do–and what not to do–to make my business successful!

One coach I spoke to advised me, “Charge less so you can build up a client base.” At the time, I had an almost visceral reaction to it. I replied, “No, I’m going to charge more.” She looked at me with surprise.

“Charge less so you can build up a client base.” It’s seductively simple advice that, no matter what industry you’re in, you’ve probably heard someone say. It’s one of those pieces of advice that seems so right. It seems like a good idea–but, in reality, many businesses fail by following this type of advice.

Why is that?

Why “Charge Less” Is Broken Advice

“Charge less” forces you to build a larger client base to support yourself. When you are just starting out with any new business, your goal is simple: Get one paying client. With all of my businesses, I was able to do that within the first 30-45 days of building the business.

The biggest challenge is getting that one client to pay you. It’s not how much they pay you. It’s that they pay you at all!

You might think, “Yes, but we don’t have the features/product/service that others have, so we can’t charge more.” This is a falsehood too. It’s a fear-based mentality that will bring you the wrong clients. Do you really want the client who would go with the better product (in his/her mind), but can’t afford it, so they go with you? Of course not!

Why People Actually Buy (Hint: Not Features)

Clients don’t buy a product entirely based on features. I know it sounds strange to hear that, especially if you’re in the startup/software industry, where pricing tables with bullet points are the norm. But it’s been true with every company I’ve built, from web hosting to software to consulting/services to my coworking space.

They buy the product because they believe it will help their business in some way. They buy a product from a startup because they believe in you. They are basically your early investors. They are placing a bet on you.

When I sold web hosting, I hired a coach/consultant who came out to my office for 2 straight days. We dove deep into my business. We found my most profitable product was a web hosting package I sold at 50% off our regular price. “Aha!” I can hear you thinking right now. “See, your web hosting clients bought on price.”

No. Turns out, they didn’t buy it because it was 50% off. They bought it because I pitched it as an opportunity to give feedback directly to us about what they wanted to see in the product. I said there would be a mandatory survey in the first 30 days where they could give feedback.

Then I got crazy busy with new customers and decided the survey wasn’t that important anyway, so I didn’t send one.

Do you know what happened next? People emailed me directly (and emailed our support desk) asking where their survey was. Some just wanted to make sure they hadn’t missed it because they were happy with their web hosting package and didn’t want it to be taken away from them because they didn’t fill out the survey. But an even larger chunk were hungry to give me feedback.

How Much Can You Charge? (Perhaps More than You Think…)

In talking to my clients, I discovered their biggest issue with the hosting industry wasn’t price. It was that, in the race to the bottom, customer service in the industry had become shockingly bad.

Here is the best part: Do you know what price I charged for that web hosting package? In a world of $10/month packages, my half-price offer was $40/month. The regular price? $80/month. For nothing fancy–just a shared hosting package with extra disk space. And the ability to give feedback–which found a “sweet spot” in a world of people who were frustrated with those $10/month hosting packages and the robots who seemed to be in charge of customer support at those companies.

I charged four times the going rate for hosting. What would have happened if I had taken the advice of “Charge less so you can build up a customer base?” I would have failed! How many annoying, frustrating $5/month clients would it have taken for me to build a full-time income? Thousands.

I took that lesson and what did I do? I charged more! By the time I sold my company, we were into high-end dedicated servers and colocation, and our ARPU (average revenue per user) was $425/month. The clients who were writing us $8,000/month checks turned out to be just as awesome as the $80/month customers!

How Many Clients Do You Need to Pay Your Bills?

How many $8,000/month clients does it take for you to make a full-time income? Probably, even with expenses included, one. Perhaps two.

What do those customers look like? Well, they’re probably running successful businesses. They’re making good money. But they have real and significant problems that you can help with. They need to set up a better sales pipeline. They’re struggling to find the right people to hire. They’re reaching the limits of email and need better solutions to manage leads, sales, and support. They’ve taken on too much work for themselves and are trying like mad to delegate.

And they’d be grateful for your help.

I came into coaching looking around at the going rates for business coaching, and deciding to come in at the top of that range. Not only because I knew I’d attract the right people–but because I knew the hardest part would be getting people to write me that check at all, no matter what the number was.

The Real Reason Most People Don’t Charge More

I also charged more for one other reason, and that was because it forces me to be a better coach. At these rates, clients have high expectations (and well they should!) “But shouldn’t you charge less because you’re just starting out?” No, I should charge more because it’s a challenge. Because it forces me to think quickly and learn on my feet.

I know I’ll be a better coach, and a better business owner, if I charge more and force myself to live up to their expectations. If the expectations become too much, I can scale back, humbly admit my mistakes, and move forward. (And I will make mistakes–as will you! That’s all part of running a successful business.)

I risk much more as a business owner by coming in and charging at the top of the spectrum. And that gets down to the real reason “Charge less so you can build up a client base” is such popular advice, even though it will have devastating consequences in most businesses. It is because we don’t feel worthy of charging more. We don’t feel like we deserve to charge that much. So we charge less out of a deep-seated fear.

I’ve just told you that people don’t buy software based on features. (A dirty secret.) But I still see many business owners compare their company to some software that’s been on the market for 10 years and say, “Well, they have more features.” Great! But do they have the features your clients need? Find the clients who need 1/10 of those features and charge them a premium price just for that. Offer clients the opportunity to give feedback. Go out and be the 37signals in your industry. Build the UNIX philosophy into your company–do one thing and do it well!

What’s the worst-case scenario? No one will pay what you charge. You’ll probably find a customer even at the top rate you can charge, but just in case you don’t, look closely at who you’re having conversations with. Are you talking with broke startups who are stretching to pay $50/month for web hosting? Are you talking with wannabe business owners who’d have to pay the money for your business out of their own pockets?

Are those the customers you really want? Or can you find the one person who’s willing to pay 10x what all those other people are willing to pay, because you’ve hit the nail on the head with the exact solution for the problem he or she is trying to solve?

I don’t want everyone as a client. I want a few awesome business rock stars. And truthfully, that’s all you need to get started, too. Stop letting fear run your business–and by extension, your pricing. Charge more. And when people say “Charge less so you can build up a client base”, feel free to refer them to this blog post so they can charge more, too!

The intersection of finding a handful of great clients and charging them a premium is where successful businesses are built. It takes a lot of courage that most people don’t have. It’s time for you to put away that fear and build a business at that crossroads.

Here’s to courage, and charging more!


Categories: LinuxChix bloggers

Global key bindings in Emacs

Akkana Peck - Sun, 2014-09-14 22:46

Global key bindings in emacs. What's hard about that, right? Just something simple like (global-set-key "\C-m" 'newline-and-indent) and you're all set.

Well, no. global-set-key gives you a nice key binding that works ... until the next time you load a mode that wants to redefine that key binding out from under you.

For many years I've had a huge collection of mode hooks that run when specific modes load. For instance, python-mode defines \C-c\C-r, my binding that normally runs revert-buffer, to do something called run-python. I never need to run python inside emacs -- I do that in a shell window. But I fairly frequently want to revert a python file back to the last version I saved. So I had a hook that ran whenever python-mode loaded to override that key binding and set it back to what I'd already set it to:

(defun reset-revert-buffer ()
  (define-key python-mode-map "\C-c\C-r" 'revert-buffer) )
(setq python-mode-hook 'reset-revert-buffer)

That worked fine -- but you have to do it for every mode that overrides key bindings and every binding that gets overridden. It's a constant chase, where you keep needing to stop editing whatever you wanted to edit and go add yet another mode-hook to .emacs after chasing down which mode is causing the problem. There must be a better solution.

A web search quickly led me to the StackOverflow discussion Globally override key bindings. I tried the techniques there; but they didn't work.

It took a lot of help from the kind folks on #emacs, but after an hour or so they finally found the key: emulation-mode-map-alists. It's only barely documented -- the key there is "The “active” keymaps in each alist are used before minor-mode-map-alist and minor-mode-overriding-map-alist" -- and there seem to be no examples anywhere on the web for how to use it. It's a list of alists mapping names to keymaps. Oh, clears it right up! Right?

Okay, here's what it means. First you define a new keymap and add your bindings to it:

(defvar global-keys-minor-mode-map (make-sparse-keymap)
  "global-keys-minor-mode keymap.")
(define-key global-keys-minor-mode-map "\C-c\C-r" 'revert-buffer)
(define-key global-keys-minor-mode-map (kbd "C-;") 'insert-date)

Now define a minor mode that will use that keymap. You'll use that minor mode for basically everything.

(define-minor-mode global-keys-minor-mode
  "A minor mode so that global key settings override annoying major modes."
  t "global-keys" 'global-keys-minor-mode-map)
(global-keys-minor-mode 1)

Now build an alist consisting of a list containing a single dotted pair: the name of the minor mode and the keymap.

;; A keymap that's supposed to be consulted before the first
;; minor-mode-map-alist.
(defconst global-minor-mode-alist
  (list (cons 'global-keys-minor-mode global-keys-minor-mode-map)))

Finally, set emulation-mode-map-alists to a list containing only the global-minor-mode-alist.

(setf emulation-mode-map-alists '(global-minor-mode-alist))

There's one final step. Even though you want these bindings to be global and work everywhere, there is one place where you might not want them: the minibuffer. To be honest, I'm not sure if this part is necessary, but it sounds like a good idea so I've kept it.

(defun my-minibuffer-setup-hook ()
  (global-keys-minor-mode 0))
(add-hook 'minibuffer-setup-hook 'my-minibuffer-setup-hook)

Whew! It's a lot of work, but it'll let me clean up my .emacs file and save me from endlessly adding new mode-hooks.

Categories: LinuxChix bloggers

How to Build a Linux Media Server

Carla Schroder (O'Reilly articles) - Sat, 2014-09-13 22:15
Just about any Linux makes an excellent media server because it's lightweight and stable, so you can use whatever flavor you're most comfortable with. Any Ubuntu variant (Ubuntu, Xubuntu, Lubuntu, and so on) is exceptionally nice to set up as a media server because they make it easy to get restricted codecs. I have Xubuntu running on a ZaReason MediaBox. This is a simple system for playing movies and music. It is not a DVR (digital video recorder), and it doesn't need a TV tuner because I don't have any broadcast TV. No cable, satellite, nor over-the-air even. Don't want it and don't miss it. But if that's something you want you may have it, because Linux wants us to be happy.
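(The package name isn't given in this excerpt, but on the Ubuntu family the usual way to pull in those restricted codecs is the restricted-extras metapackage; for a Xubuntu box like the one described here, something along these lines should do it:

sudo apt-get install xubuntu-restricted-extras

The other flavors have equivalents such as ubuntu-restricted-extras and kubuntu-restricted-extras.)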
Categories: LinuxChix bloggers

Making emailed LinkedIn discussion thread links actually work

Akkana Peck - Thu, 2014-09-11 19:10

I don't use web forums, the kind you have to read online, because they don't scale. If you're only interested in one subject, then they work fine: you can keep a browser tab for your one or two web forums perennially open and hit reload every few hours to see what's new. If you're interested in twelve subjects, each of which has several different web forums devoted to it -- how could you possibly keep up with that? So I don't bother with forums unless they offer an email gateway, so they'll notify me by email when new discussions get started, without my needing to check all those web pages several times per day.

LinkedIn discussions mostly work like a web forum. But for a while, they had a reasonably usable email gateway. You could set a preference to be notified of each new conversation. You still had to click on the web link to read the conversation so far, but if you posted something, you'd get the rest of the discussion emailed to you as each message was posted. Not quite as good as a regular mailing list, but it worked pretty well. I used it for several years to keep up with the very active Toastmasters group discussions.

About a year ago, something broke in their software, and they lost the ability to send email for new conversations. I filed a trouble ticket, and got a note saying they were aware of the problem and working on it. I followed up three months later (by filing another ticket -- there's no way to add to an existing one) and got a response saying be patient, they were still working on it. 11 months later, I'm still being patient, but it's pretty clear they have no intention of ever fixing the problem.

Just recently I fiddled with something in my LinkedIn prefs, and started getting "Popular Discussions" emails every day or so. The featured "popular discussion" is always something stupid that I have no interest in, but it's followed by a section headed "Other Popular Discussions" that at least gives me some idea what's been posted in the last few days. Seemed like it might be worth clicking on the links even though it means I'd always be a few days late responding to any conversations.

Except -- none of the links work. They all go to a generic page with a red header saying "Sorry it seems there was a problem with the link you followed."

I'm reading the plaintext version of the mail they send out. I tried viewing the HTML part of the mail in a browser, and sure enough, those links worked. So I tried comparing the text links with the HTML:

Text version:
http://www.linkedin.com/e/v2?e=3x1l-hzwzd1q8-6f&amp;t=gde&amp;midToken=AQEqep2nxSZJIg&amp;ek=b2_anet_digest&amp;li=82&amp;m=group_discussions&amp;ts=textdisc-6&amp;itemID=5914453683503906819&amp;itemType=member&amp;anetID=98449

HTML version:
http://www.linkedin.com/e/v2?e=3x1l-hzwzd1q8-6f&t=gde&midToken=AQEqep2nxSZJIg&ek=b2_anet_digest&li=17&m=group_discussions&ts=grouppost-disc-6&itemID=5914453683503906819&itemType=member&anetID=98449

Well, that's clear as mud, isn't it?

HTML entity substitution

I pasted both links one on top of each other, to make it easier to compare them one at a time. That made it fairly easy to find the first difference:

Text version: http://www.linkedin.com/e/v2?e=3x1l-hzwzd1q8-6f&amp;t=gde&amp;midToken= ...
HTML version: http://www.linkedin.com/e/v2?e=3x1l-hzwzd1q8-6f&t=gde&midToken= ...

Time to die laughing: they're doing HTML entity substitution on the plaintext part of their email notifications, changing & to &amp; everywhere in the link.

If you take the link from the text email and replace &amp; with &, the link works, and takes you to the specific discussion.
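(Not in the original post, but if you find yourself doing that replacement by hand, a shell one-liner handles it; the URL here is a stand-in for whatever you copied out of the text email:

echo 'http://www.linkedin.com/e/v2?e=...&amp;t=gde&amp;midToken=...' | sed 's/&amp;/\&/g'

The backslash matters: in a sed replacement a bare & means "the whole match", so \& is needed to get a literal ampersand.)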

Pagination

Except you can't actually read the discussion. I went to a discussion that had been open for 2 days and had 35 responses, and LinkedIn only showed four of them. I don't even know which four they are -- are they the first four, the last four, or some Facebook-style "four responses we thought you'd like"? There's a button to click on to show the most recent entries, but then I only see a few of the most recent responses, still not the whole thread.

Hooray for the web -- of course, plenty of other people have had this problem too, and a little web searching unveiled a solution. Add a pagination token to the end of the URL that tells LinkedIn to show 1000 messages at once:

&count=1000&paginationToken=

It won't actually show 1000 (or all) responses -- but if you start at the beginning of the page and scroll down reading responses one by one, it will auto-load new batches. Yes, infinite scrolling pages can be annoying, but at least it's a way to read a LinkedIn conversation in order.

Making it automatic

Okay, now I know how to edit one of their URLs to make it work. Do I want to do that by hand any time I want to view a discussion? Noooo!

Time for a script! Since I'll be selecting the URLs from mutt, they'll be in the X PRIMARY clipboard. And unfortunately, mutt adds newlines so I might as well strip those as well as fixing the LinkedIn problems. (Firefox will strip newlines for me when I paste in a multi-line URL, but why rely on that?)

Here's the important part of the script:

import sys
import subprocess, gtk

primary = gtk.clipboard_get(gtk.gdk.SELECTION_PRIMARY)
if not primary.wait_is_text_available():
    sys.exit(0)
link = primary.wait_for_text()
link = link.replace("\n", "").replace("&amp;", "&") + \
       "&count=1000&paginationToken="
subprocess.call(["firefox", "-new-tab", link])

And here's the full script: linkedinify on GitHub. I also added it to pyclip, the script I call from Openbox to open a URL in Firefox when I middle-click on the desktop.

Now I can finally go back to participating in those discussions.

Categories: LinuxChix bloggers

Accessible KDE, Kubuntu

Valorie Zimmerman 2 - Thu, 2014-09-11 10:31
KDE is community. We welcome everyone, and make our software work for everyone. So, accessibility is central to all our work, in the community, in testing, coding, documentation. Frederik has been working to make this true in Qt and in KDE for many years; Peter has done valuable work with Simon, and Jose is doing testing and some patches to fix stuff.

Now that KF5 is rolling out, however, we're finding a few problems with our KDE software such as widgets, KDE configuration modules (kcm) and even websites. Unfortunately, the a11y team is too small to handle all this! Obviously, we need to grow the team.

So we've decided to make heavier use of the forums, where we might find new testers and folks to fix the problems, and perhaps even people to fix up the https://accessibility.kde.org/ website to be as awesome as the KDE-Edu site. The Visual Design Group are the leaders here, and they are awesome!

Please drop by #kde-accessibility on Freenode or the Forum https://forum.kde.org/viewforum.php?f=216 to read up on what needs doing, and learn how to test. People stepping up to learn forum moderation are also welcome. Frederik has recently posted about the BoF: https://forum.kde.org/viewtopic.php?f=216&t=122808

A11y was a topic in the Kubuntu BoF today, and we're going to make a new push to make sure our accessibility options work well out of the box, i.e. from first boot. This will involve working with the Ubuntu a11y team, yeah!

More information is available at https://community.kde.org/Accessibility and https://userbase.kde.org/Applications/Accessibility
Categories: LinuxChix bloggers

Project for Albuquerque Mini Maker Faire

Terri - Tue, 2014-09-09 16:31
This is crossposted from Curiousity.ca, my personal maker blog. If you want to link to this post, please use the original link since the formatting there is usually better.

In that way that we have, John and I are working together on a last-minute project for our next event, the Albuquerque Mini Maker Faire. I’m too tired to write a whole lot of text, so I took some photos instead. With no explanation, can you tell what is starting to take shape in our house?


[Eleven photos: 20140909-IMG_4790 through 20140909-IMG_4816]




Fixing mistakes and growing stronger

Valorie Zimmerman 2 - Tue, 2014-09-09 07:11
In Creativity, Inc., Catmull explores an example of where their structure had created some problems, and how they identified and fixed that, improving their over-all culture. I know this is a wall of text, but Catmull asks excellent questions. I felt it was worthwhile to copy for you. He says,
Improvements didn't happen overnight. But by the time we finished A Bug's Life, the production managers were no longer seen as impediments to creative process, but as peers--as first-class citizens. We had become better. 
This was success in itself, but it came with an added and unexpected benefit: The act of thinking about the problem and responding to it was invigorating and rewarding. We realized that our purpose was not merely to build a studio that made hit films but to foster a creative culture that would continually ask questions. Questions like: If we had done some things right to achieve success, how could we ensure that we understood what those things were? Could we replicate them on our next projects? Perhaps as important, was replication of success even the right thing to do? How many serious, potentially disastrous problems were lurking just out of sight and threatening to undo us? What, if anything, could we do to bring them to light? How much of our success was luck? What would happen to our egos if we continued to succeed? Would they grow so large they could hurt us, and if so, what could we do to address that overconfidence? What dynamics would arise now that we were bringing new people into a successful enterprise as opposed to a struggling startup?

What had drawn me to science, all those years ago, was the search for understanding. Human interaction is far more complex than relativity or string theory, of course, but that only made it more interesting and important; it constantly challenged my presumptions.... Figuring out how to build a sustainable creative culture--one that didn't just pay lip service to the importance of things like honesty, excellence, communication, originality, and self-assessment but really *committed* to them, no matter how uncomfortable that became--wasn't a singular assignment....

As I saw it, our mandate was to foster a culture that would seek to keep our sightlines clear, even as we accepted that we were often trying to engage with and fix what we could not see. My hope was to make this culture so vigorous that it would survive when Pixar's founding members were long gone. [p. 64-5]
Again, I see an almost perfect match between their task and ours, where ours = KDE e.V. In the Community Working Group (CWG) in particular, I see my task as essentially gardening. This includes improving the soil and weeding, but never removing valuable little shoots which can grow into exciting new directions for the community. Of course I can't carry the metaphor too far, since others do the planting. But we can keep the conditions for growth optimal with our work.

In the documentation workshop yesterday, we explored the current state of the KDE documentation, how we can improve access, and grow the documentation team again. We also found some large choke points, which include KDE.org. We really need a web team! KDE.org is valuable real estate on the web, which has been neglected for too long. More about that later.....

For now, looking forward to another day of hard work and fun in Brno!


Categories: LinuxChix bloggers

Dot Reminders

Akkana Peck - Mon, 2014-09-08 03:10

I read about cool computer tricks all the time. I think "Wow, that would be a real timesaver!" And then a week later, when it actually would save me time, I've long since forgotten all about it.

After yet another session where I wanted to open a frequently opened file in emacs and thought "I think I made a bookmark for that a while back", but then decided it's easier to type the whole long pathname rather than go re-learn how to use emacs bookmarks, I finally decided I needed a reminder system -- something that would poke me and remind me of a few things I want to learn.

I used to keep cheat sheets and quick reference cards on my desk; but that never worked for me. Quick reference cards tend to be 50 things I already know, 40 things I'll never care about and 4 really great things I should try to remember. And eventually they get burned in a pile of other papers on my desk and I never see them again.

My new system is working much better. I created a file in my home directory called .reminders, in which I put a few -- just a few -- things I want to learn and start using regularly. It started out at about 6 lines but now it's grown to 12.
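(The actual contents of the file aren't shown in the post, so purely as an illustration, a .reminders file built from the tips mentioned here might look like:

emacs bookmarks: C-x r m to set, C-x r b to jump, C-x r l to list
diff <(cmd1) <(cmd2)   # compare the output of two commands

It's just plain text that gets cat'ed at login, so the format is entirely up to you.)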

Then I put this in my .zlogin (of course, you can do this for any shell, not just zsh, though the syntax may vary):

if [[ -f ~/.reminders ]]; then
    cat ~/.reminders
fi

Now, in every login shell (which for me is each new terminal window I create on my desktop), I see my reminders. Of course, I don't read them every time; but I look at them often enough that I can't forget the existence of great things like emacs bookmarks, or diff <(cmd1) <(cmd2).

And if I forget the exact keystroke or syntax, I can always cat ~/.reminders to remind myself. And after a few weeks of regular use, I finally have internalized some of these tricks, and can remove them from my .reminders file.

It's not just for tech tips, either; I've used a similar technique for reminding myself of hard-to-remember vocabulary words when I was studying Spanish. It could work for anything you want to teach yourself.

Although the details of my .reminders are specific to Linux/Unix and zsh, of course you could use a similar system on any computer. If you don't open new terminal windows, you can set a reminder to pop up when you first log in, or once a day, or whatever is right for you. The important part is to have a small set of tips that you see regularly.

Categories: LinuxChix bloggers

Simcoe’s August 2014 Checkup

Elizabeth Krumbach - Mon, 2014-09-08 00:57

This upcoming December will mark Simcoe living with the CRF diagnosis for 3 years. We’re happy to say that she continues to do well, with this latest batch of blood work showing more good news about her stable levels.

Unfortunately we brought her in a few weeks early this time following a bloody sneeze. As I’ve written earlier this year, they’ve both been a bit sneezy this year with an as yet undiagnosed issue that has been eluding all tests. Every month or so they switch off who is sneezing, but this was the first time there was any blood.

Simcoe at vet
“I still don’t like vet visits.”

Following the exam, the vet said she wasn’t worried. The bleeding was a one time thing and could have just been caused by rawness brought on by the sneezing and sniffles. Since the appointment on August 26th we haven’t seen any more problems (and the cold seems to have migrated back to Caligula).

As for her levels, it was great to see her weight come up a bit, from 9.62 to 9.94lbs.

Her BUN and CRE levels have both shifted slightly, from 51 to 59 on BUN and 3.9 to 3.8 on CRE.

BUN: 59 (normal range: 14-36)
CRE: 3.8 (normal range: .6-2.4)

Categories: LinuxChix bloggers

Late posting: Heading to Brno for Akademy!

Valorie Zimmerman 2 - Sun, 2014-09-07 07:21
So excited to be in the air over Seattle, heading toward Vienna and Brno, and Akademy! Beside me is Scarlett Clark, who will be attending her first Akademy, and first Kubuntu meeting. We've both been sponsored by Ubuntu for the costs of travel; thank you! Scarlett was telling me, as we waited to board our first flight, how long she looked for a place to contribute to a Linux community. She said she tried for years, in many distributions, on mail lists and in IRC. What she was told was "do something." How does a first-time contributor know what is needed, where to ask, and how to make that crucial first step?

I was glad to hear that once she found the KDE-doc-english mail list, that she was encouraged to stick around, get onto IRC, and guided every step of the way. I was also happy to hear that Yuri, Sune and Jonathan Riddell all made her feel welcome, and showed her where to find the information she needed to make her contributions high quality. When Scarlett showed up in #kubuntu-devel offering to learn to package, I was over the moon with happiness. I really love to see more women involved in free and open source, and especially in KDE and Kubuntu, my Linux home.

I was a bit sad that the Debian community was not welcoming to her, with Sune the one bright spot. Yeah SUNE! (By the way, hire him!) I think she will find a nice home there as well, however, if our plans to do some common packaging between Kubuntu and Debian work out in the future. It was interesting to see the blog by the developers of systemd discussing the same issue we've been considering: the waste of time packaging the same applications and other stuff over and over again. So much wasted work, when we could really be using our time more productively. Rather than working harder, let's work smarter! Check out their blog for their take on the issue: http://0pointer.net/blog/revisiting-how-we-put-together-linux-systems.html

Welcome to Scarlett, who is planning to get her blog up and running again, and on the planets. She'll be saying more about these subjects in the future. Scarlett, and all you other first-time Akademy attendees, a hearty hug of greeting. Have a wonderful time! See me in person for a real hug!

PS: I couldn't post this until now, Sunday morning. The Debian folks here, especially Pinotree have been great! I look forward to our meeting with them on Thursday morning.
Categories: LinuxChix bloggers

Creativity and KDE

Valorie Zimmerman 2 - Sat, 2014-09-06 07:16
Creativity Inc., by Ed Catmull, President of Pixar

My book to read for this trip finally arrived from the library last week, and I could hardly wait to dip into it. I see a profound parallel between the work we do in KDE, and the experiences Catmull recounts in his book. He structures it as "lessons learned" as he led one of the most creative teams in both entertainment and in technology. His dream was always to marry the two fields, which he has done brilliantly at Pixar. He tried to make a place where you don't have to ask permission to take responsibility. [p. 51]

Always take a chance on better, even if it seems threatening, Catmull says on p. 23. When he hired a person he deemed more qualified for his job than he was, the risk paid off both creatively and personally. Playing it safe is what humans tend to do far too often, especially after they have become successful. Our stone age brains hate to lose, more than they like to win big. Knowing this about ourselves sometimes gives us the courage to 'go big' rather than 'go home.' I have seen us follow this advice in the past year or two, and I hope we have the courage to continue on our brave course.

However, experience showed Catmull that being confident about the value of innovation was not enough. We needed buy-in from the community we were trying to serve.[p 31] My observation is that the leaders in the KDE community have learned this lesson very well. The collaborative way we develop new ideas, new products, new processes helps get that buy in. However, we're not perfect. We often lack knowledge of our "end users" -- not our fellow community members, but some of the millions of students, tech workers and just plain computer users. How often do teams schedule testing sessions where they watch users as they try to accomplish tasks using our software? I know we do it, and we need to do it more often.

Some sources rate us as the largest FOSS community. This can be seen as success. This achievement can have hidden dangers, however. When Catmull ran into trouble, in spite of his 'open door' management style, he found that the good stuff was hiding the bad stuff.... When downsides coexist with upsides, as they often do, people are reluctant to explore what's bugging them for fear of being labeled complainers.[p. 63] This is really dangerous. Those downsides are poison, and they must be exposed to the light, dealt with, fixed, or they will destroy a community or a part of a community. On the upside, the KDE community created the Community Working Group (CWG), and empowered us to do our job properly. On the downside, often people hide their misgivings, their irritations, their fears, until they explode. Not only does such an explosion shock the people surrounding the damage, but it shocks the person exploding as well. And afterwards, the most we can do is often damage control, rather than helping the team grow healthier, and find more creative ways to deal with those downsides.

Another danger is that even the smartest people can form an ineffective team if they are mis-matched. Focus on how a team is performing, not on the talents of the individuals within it.... Getting the right people and the right chemistry is more important than getting the right idea.[p. 74] One of the important strengths of FOSS teams, and KDE teams in particular, is that people feel free to come and go. If anyone feels walled out, or trapped in, we need to remove those barriers. When people are working with those who feed their energy, they in turn can pass it along. When the current stops flowing, it's time to do something different. Of course this prevents burnout, but more important, it keeps teams feeling alive, energetic, and fun. Find, develop, and support good people, and they will find, develop, and own good ideas.[p. 76] I think we instinctively know in KDE that good ideas are common. What is unusual is someone else stepping up to make those "good ideas" we are often given actually happen. Instead, the great stuff happens when someone has an itch, and decides to scratch it, and draws others to help her make that vision become reality.

The final idea I want to present in this post is directed to all the leaders in KDE. This doesn't mean just the board of the e.V., by the way. The leaders in KDE are those who have volunteered to maintain packages, mentor students, moderate the mail lists and forums, become channel ops in IRC, write the promo articles, release notes and announcements, do the artwork, write the documentation, keep the wikis accurate, helpful and free of spam, organize sprints and other meetings such as Akademy, translate our docs and internationalize our software, design and build-in accessibility, staff booths, and take on many other responsibilities such as serving on working groups and other committees. This is a shared responsibility we carry to one another, and what keeps our community healthy.

It is management's job to take the long view, to intervene and protect our people from their willingness to pursue excellence at all costs. Not to do so would be irresponsible.... If we are in this for the long haul, we have to take care of ourselves, support healthy habits and encourage our employees to have fulfilling lives outside of work. [p. 77] This is the major task of the e.V. and especially the Board, in my opinion, and of course the task of the CWG as well.

Isn't this stuff great!? I'll be writing more blog posts inspired by this book as I get further into it. 
Categories: LinuxChix bloggers

Using strace to find configuration file locations

Akkana Peck - Tue, 2014-09-02 19:06

I was using strace to figure out how to set up a program, lftp, and a friend commented that he didn't know how to use it and would like to learn. I don't use strace often, but when I do, it's indispensable -- and it's easy to use. So here's a little tutorial.

My problem, in this case, was that I needed to find out what configuration file I needed to modify in order to set up an alias in lftp. The lftp man page tells you how to define an alias, but doesn't tell you how to save it for future sessions; apparently you have to edit the configuration file yourself.

But where? The man page suggested a couple of possible config file locations -- ~/.lftprc and ~/.config/lftp/rc -- but neither of those existed. I wanted to use the one that already existed. I had already set up bookmarks in lftp and it remembered them, so it must have a config file already, somewhere. I wanted to find that file and use it.

So the question was, what files does lftp read when it starts up? strace lets you snoop on a program and see what it's doing.

strace shows you all system calls being used by a program. What's a system call? Well, it's anything in section 2 of the Unix manual. You can get a complete list by typing: man 2 syscalls (you may have to install developer man pages first -- on Debian that's the manpages-dev package). But the important thing is that most file access calls -- open, read, chmod, rename, unlink (that's how you remove a file), and so on -- are system calls.

You can run a program under strace directly:

$ strace lftp sitename

Interrupt it with Ctrl-C when you've seen what you need to see.

Pruning the output

And of course, you'll see tons of crap you're not interested in, like rt_sigaction(SIGTTOU) and fcntl64(0, F_GETFL). So let's get rid of that first. The easiest way is to use grep. Let's say I want to know every file that lftp opens. I can do it like this:

$ strace lftp sitename |& grep open

I have to use |& instead of just | because strace prints its output on stderr instead of stdout.

That's pretty useful, but it's still too much. I really don't care to know about strace opening a bazillion files in /usr/share/locale/en_US/LC_MESSAGES, or libraries like /usr/lib/i386-linux-gnu/libp11-kit.so.0.

In this case, I'm looking for config files, so I really only want to know which files it opens in my home directory. Like this:

$ strace lftp sitename |& grep 'open.*/home/akkana'

In other words, show me just the lines that have either the word "open" or "read" followed later by the string "/home/akkana".

Digression: grep pipelines

Now, you might think that you could use a simpler pipeline with two greps:

$ strace lftp sitename |& grep open | grep /home/akkana

But that doesn't work -- nothing prints out. Why? Because grep, under certain circumstances that aren't clear to me, buffers its output, so in some cases when you pipe grep | grep, the second grep will wait until it has collected quite a lot of output before it prints anything. (This comes up a lot with tail -f as well.) You can avoid that with

$ strace lftp sitename |& grep --line-buffered open | grep /home/akkana

but that's too much to type, if you ask me.

Back to that strace | grep

Okay, whichever way you grep for open and your home directory, it gives:

open("/home/akkana/.local/share/lftp/bookmarks", O_RDONLY|O_LARGEFILE) = 5
open("/home/akkana/.netrc", O_RDONLY|O_LARGEFILE) = -1 ENOENT (No such file or directory)
open("/home/akkana/.local/share/lftp/rl_history", O_RDONLY|O_LARGEFILE) = 5
open("/home/akkana/.inputrc", O_RDONLY|O_LARGEFILE) = 5

Now we're getting somewhere! The file where it's getting its bookmarks is ~/.local/share/lftp/bookmarks -- and I probably can't use that to set my alias.

But wait, why doesn't it show lftp trying to open those other config files?

Using script to save the output

At this point, you might be sick of running those grep pipelines over and over. Most of the time, when I run strace, instead of piping it through grep I run it under script to save the whole output.

script is one of those poorly named, ungoogleable commands, but it's incredibly useful. It runs a subshell and saves everything that appears in that subshell, both what you type and all the output, in a file.

Start script, then run lftp inside it:

$ script /tmp/lftp.strace
Script started on Tue 26 Aug 2014 12:58:30 PM MDT
$ strace lftp sitename

After the flood of output stops, I type Ctrl-D or Ctrl-C to exit lftp, then another Ctrl-D to exit the subshell script is using. Now all the strace output was in /tmp/lftp.strace and I can grep in it, view it in an editor or anything I want.

So, what files is it looking for in my home directory, and why don't they show up as open attempts?

$ grep /home/akkana /tmp/lftp.strace

Ah, there it is! A bunch of lines like this:

access("/home/akkana/.lftprc", R_OK) = -1 ENOENT (No such file or directory)
stat64("/home/akkana/.lftp", 0xbff821a0) = -1 ENOENT (No such file or directory)
mkdir("/home/akkana/.config", 0755) = -1 EEXIST (File exists)
mkdir("/home/akkana/.config/lftp", 0755) = -1 EEXIST (File exists)
access("/home/akkana/.config/lftp/rc", R_OK) = 0

So I should have looked for access and stat as well as open. Now I have the list of files it's looking for. And, curiously, it creates ~/.config/lftp if it doesn't exist already, even though it's not going to write anything there.
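(An aside that isn't in the original post: strace can do some of this filtering for you. Its -e trace=file option restricts the output to syscalls that take a filename -- open, access, stat, mkdir and friends -- and -o writes the output to a file, so a single run roughly like this would have caught all of the calls above without the script/grep dance:

strace -e trace=file -o /tmp/lftp.strace lftp sitename

You'd still grep /tmp/lftp.strace for /home/akkana afterwards, but without having to guess which syscall names to look for.)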

So I created ~/.config/lftp/rc and put my alias there. Worked fine. And I was able to edit my bookmark in ~/.local/share/lftp/bookmarks later when I had a need for that. All thanks to strace.

Categories: LinuxChix bloggers

Bash Arrays

Renata - Wed, 2014-08-27 17:47

Arrays are helpful, and I’ll give some examples for reference. They can be a little bit confusing, but once you get used to them, it’s easy!

First you initialize the arrays

cat[1]="Bub"
cat[2]="Grumpy"
cat[3]="Luna"

feat[1]="cute"
feat[2]="terrible"
feat[3]="fashion"

Then you use them as you wish. You can, at first, just list them individually

echo "${cat[3]} is ${feat[1]}"

or list all of the items in a specific array
echo "Cats I like: ${cat[@]}"

Something like that would also work:

for i in {1..3}
do
echo "${cat[i]} is ${feat[i]}!"
done

That opens many possibilities. Life is not only about internet cats (although it sometimes seems so).
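(Not from the original post, but two related expansions are handy once you start using arrays; a minimal sketch with the same cats:

echo "I know ${#cat[@]} cats"        # ${#cat[@]} is the number of elements

for i in "${!cat[@]}"                # ${!cat[@]} expands to the indices in use
do
    echo "${cat[i]} is ${feat[i]}!"
done

Looping over "${!cat[@]}" means you don't have to hardcode {1..3}, which helps when the array is sparse or grows.)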

Make good use of your arrays, they’re great!

(It takes me 8 months to update the site and I write a silly post about bash arrays, I know. Sorry, I was thinking about them.)

Categories: LinuxChix bloggers

Ditch Agile, Go With Common Sense

L J Laubenheimer (Iconoclast Blast) - Tue, 2014-07-15 17:40
I am so sick of Agile I could puke. Agile "methods" and "processes" are often used as a bludgeon to enforce the great speedup: doing more, faster, with fewer resources. I see estimates forced to fit the PM or manager's demanded hard deadline, hours getting longer because of wasted time in meetings, and "rapid" deployment of garbage code that needs to be rolled back because no integration testing was done (eliminating QA does that to you).
Categories: LinuxChix bloggers

How To ***REALLY*** Advocate for the Customer

L J Laubenheimer (Iconoclast Blast) - Tue, 2014-07-15 17:32
I occasionally see job ads for "customer advocates" or "customer evangelists". They all turn out to be sales and marketing, that is, advocating or evangelizing stuff to the would-be user. That is so ass-backwards that it makes me foam at the mouth.
Categories: LinuxChix bloggers

Haecksen organisers mailing list

Oceania Women of Open Tech - Mon, 2014-05-26 03:54

With OWOOT closing, the OWOOT list will no longer be available to volunteers organising the Haecksen miniconference at linux.conf.au.

If you're interested in helping out with Haecksen in future years, please join the new Haecksen organisers mailing list hosted by Linux Australia.

Categories: News about LinuxChix