
Global key bindings in Emacs

Akkana Peck - Sun, 2014-09-14 22:46

Global key bindings in emacs. What's hard about that, right? Just something simple like (global-set-key "\C-m" 'newline-and-indent) and you're all set.

Well, no. global-set-key gives you a nice key binding that works ... until the next time you load a mode that wants to redefine that key binding out from under you.

For many years I've had a huge collection of mode hooks that run when specific modes load. For instance, python-mode defines \C-c\C-r, my binding that normally runs revert-buffer, to do something called run-python. I never need to run python inside emacs -- I do that in a shell window. But I fairly frequently want to revert a python file back to the last version I saved. So I had a hook that ran whenever python-mode loaded to override that key binding and set it back to what I'd already set it to:

(defun reset-revert-buffer ()
  (define-key python-mode-map "\C-c\C-r" 'revert-buffer)
)
(setq python-mode-hook 'reset-revert-buffer)

That worked fine -- but you have to do it for every mode that overrides key bindings and every binding that gets overridden. It's a constant chase, where you keep needing to stop editing whatever you wanted to edit and go add yet another mode-hook to .emacs after chasing down which mode is causing the problem. There must be a better solution.

A web search quickly led me to the StackOverflow discussion Globally override key bindings. I tried the techniques there; but they didn't work.

It took a lot of help from the kind folks on #emacs, but after an hour or so they finally found the key: emulation-mode-map-alists. It's only barely documented -- the key there is "The “active” keymaps in each alist are used before minor-mode-map-alist and minor-mode-overriding-map-alist" -- and there seem to be no examples anywhere on the web for how to use it. It's a list of alists mapping names to keymaps. Oh, clears it right up! Right?

Okay, here's what it means. First you define a new keymap and add your bindings to it:

(defvar global-keys-minor-mode-map (make-sparse-keymap)
  "global-keys-minor-mode keymap.")
(define-key global-keys-minor-mode-map "\C-c\C-r" 'revert-buffer)
(define-key global-keys-minor-mode-map (kbd "C-;") 'insert-date)

Now define a minor mode that will use that keymap. You'll use that minor mode for basically everything.

(define-minor-mode global-keys-minor-mode
  "A minor mode so that global key settings override annoying major modes."
  t "global-keys" 'global-keys-minor-mode-map)
(global-keys-minor-mode 1)

Now build an alist consisting of a list containing a single dotted pair: the name of the minor mode and the keymap.

;; A keymap that's supposed to be consulted before the first
;; minor-mode-map-alist.
(defconst global-minor-mode-alist
  (list (cons 'global-keys-minor-mode global-keys-minor-mode-map)))

Finally, set emulation-mode-map-alists to a list containing only the global-minor-mode-alist.

(setf emulation-mode-map-alists '(global-minor-mode-alist))

There's one final step. Even though you want these bindings to be global and work everywhere, there is one place where you might not want them: the minibuffer. To be honest, I'm not sure if this part is necessary, but it sounds like a good idea so I've kept it.

(defun my-minibuffer-setup-hook ()
  (global-keys-minor-mode 0))
(add-hook 'minibuffer-setup-hook 'my-minibuffer-setup-hook)

Whew! It's a lot of work, but it'll let me clean up my .emacs file and save me from endlessly adding new mode-hooks.

Categories: LinuxChix bloggers

How to Build a Linux Media Server

Carla Schroder (O'Reilly articles) - Sat, 2014-09-13 22:15
Just about any Linux makes an excellent media server because it's lightweight and stable, so you can use whatever flavor you're most comfortable with. Any Ubuntu variant (Ubuntu, Xubuntu, Lubuntu, and so on) is exceptionally nice to set up as a media server because they all make it easy to get restricted codecs. I have Xubuntu running on a ZaReason MediaBox. This is a simple system for playing movies and music. It is not a DVR (digital video recorder), and it doesn't need a TV tuner because I don't have any broadcast TV. No cable, satellite, or even over-the-air. Don't want it and don't miss it. But if that's something you want, you may have it, because Linux wants us to be happy.
Categories: LinuxChix bloggers

Making emailed LinkedIn discussion thread links actually work

Akkana Peck - Thu, 2014-09-11 19:10

I don't use web forums, the kind you have to read online, because they don't scale. If you're only interested in one subject, then they work fine: you can keep a browser tab for your one or two web forums perennially open and hit reload every few hours to see what's new. But if you're interested in twelve subjects, each of which has several different web forums devoted to it -- how could you possibly keep up with that? So I don't bother with forums unless they offer an email gateway that will notify me by email when new discussions get started, without my needing to check all those web pages several times per day.

LinkedIn discussions mostly work like a web forum. But for a while, they had a reasonably usable email gateway. You could set a preference to be notified of each new conversation. You still had to click on the web link to read the conversation so far, but if you posted something, you'd get the rest of the discussion emailed to you as each message was posted. Not quite as good as a regular mailing list, but it worked pretty well. I used it for several years to keep up with the very active Toastmasters group discussions.

About a year ago, something broke in their software, and they lost the ability to send email for new conversations. I filed a trouble ticket, and got a note saying they were aware of the problem and working on it. I followed up three months later (by filing another ticket -- there's no way to add to an existing one) and got a response saying be patient, they were still working on it. 11 months later, I'm still being patient, but it's pretty clear they have no intention of ever fixing the problem.

Just recently I fiddled with something in my LinkedIn prefs, and started getting "Popular Discussions" emails every day or so. The featured "popular discussion" is always something stupid that I have no interest in, but it's followed by a section headed "Other Popular Discussions" that at least gives me some idea what's been posted in the last few days. Seemed like it might be worth clicking on the links even though it means I'd always be a few days late responding to any conversations.

Except -- none of the links work. They all go to a generic page with a red header saying "Sorry it seems there was a problem with the link you followed."

I'm reading the plaintext version of the mail they send out. I tried viewing the HTML part of the mail in a browser, and sure enough, those links worked. So I tried comparing the text links with the HTML:

Text version:
http://www.linkedin.com/e/v2?e=3x1l-hzwzd1q8-6f&amp;t=gde&amp;midToken=AQEqep2nxSZJIg&amp;ek=b2_anet_digest&amp;li=82&amp;m=group_discussions&amp;ts=textdisc-6&amp;itemID=5914453683503906819&amp;itemType=member&amp;anetID=98449

HTML version:
http://www.linkedin.com/e/v2?e=3x1l-hzwzd1q8-6f&t=gde&midToken=AQEqep2nxSZJIg&ek=b2_anet_digest&li=17&m=group_discussions&ts=grouppost-disc-6&itemID=5914453683503906819&itemType=member&anetID=98449

Well, that's clear as mud, isn't it?

HTML entity substitution

I pasted both links one on top of the other, to make it easier to compare them character by character. That made it fairly easy to find the first difference:

Text version: http://www.linkedin.com/e/v2?e=3x1l-hzwzd1q8-6f&amp;t=gde&amp;midToken= ...
HTML version: http://www.linkedin.com/e/v2?e=3x1l-hzwzd1q8-6f&t=gde&midToken= ...

Time to die laughing: they're doing HTML entity substitution on the plaintext part of their email notifications, changing every & to &amp; in the link.

If you take the link from the text email and replace each &amp; with &, the link works and takes you to the specific discussion.
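
(A quick way to check a single link, from me rather than from the original post; it assumes the xclip and sed utilities are available. Select the broken link, undo the substitution on the command line, and paste the result into a browser:)

$ xclip -o -selection primary | sed 's/&amp;/\&/g'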

Pagination

Except you can't actually read the discussion. I went to a discussion that had been open for 2 days and had 35 responses, and LinkedIn only showed four of them. I don't even know which four they are -- the first four, the last four, or some Facebook-style "four responses we thought you'd like". There's a button to click on to show the most recent entries, but then I only see a few of the most recent responses, still not the whole thread.

Hooray for the web -- of course, plenty of other people have had this problem too, and a little web searching unveiled a solution: add a pagination token to the end of the URL that tells LinkedIn to show 1000 messages at once.

&count=1000&paginationToken=

It won't actually show 1000 (or all) responses -- but if you start at the beginning of the page and scroll down reading responses one by one, it will auto-load new batches. Yes, infinite scrolling pages can be annoying, but at least it's a way to read a LinkedIn conversation in order.

Making it automatic

Okay, now I know how to edit one of their URLs to make it work. Do I want to do that by hand any time I want to view a discussion? Noooo!

Time for a script! Since I'll be selecting the URLs from mutt, they'll be in the X PRIMARY clipboard. And unfortunately, mutt adds newlines so I might as well strip those as well as fixing the LinkedIn problems. (Firefox will strip newlines for me when I paste in a multi-line URL, but why rely on that?)

Here's the important part of the script:

import sys, subprocess, gtk

primary = gtk.clipboard_get(gtk.gdk.SELECTION_PRIMARY)
if not primary.wait_is_text_available():
    sys.exit(0)
link = primary.wait_for_text()

# Strip mutt's newlines, undo the &amp; substitution, and add the pagination token:
link = link.replace("\n", "").replace("&amp;", "&") + \
       "&count=1000&paginationToken="

subprocess.call(["firefox", "-new-tab", link])

And here's the full script: linkedinify on GitHub. I also added it to pyclip, the script I call from Openbox to open a URL in Firefox when I middle-click on the desktop.

Now I can finally go back to participating in those discussions.

Categories: LinuxChix bloggers

Accessible KDE, Kubuntu

Valorie Zimmerman 2 - Thu, 2014-09-11 10:31
KDE is community. We welcome everyone, and make our software work for everyone. So accessibility is central to all our work: in the community, in testing, in coding, in documentation. Frederik has been working to make this true in Qt and in KDE for many years, Peter has done valuable work with Simon, and Jose is doing testing and contributing some patches to fix stuff.

However, now that KF5 is rolling out, we're finding a few problems with our KDE software, such as widgets, KDE configuration modules (kcm) and even websites. But the a11y team is too small to handle all this! Obviously, we need to grow the team.

So we've decided to make heavier use of the forums, where we might find new testers and folks to fix the problems, and perhaps even people to fix up the https://accessibility.kde.org/ website to be as awesome as the KDE-Edu site. The Visual Design Group are the leaders here, and they are awesome!

Please drop by #kde-accessibility on Freenode or the Forum https://forum.kde.org/viewforum.php?f=216 to read up on what needs doing, and learn how to test. People stepping up to learn forum moderation are also welcome. Frederik has recently posted about the BoF: https://forum.kde.org/viewtopic.php?f=216&t=122808

A11y was a topic in the Kubuntu BoF today, and we're going to make a new push to make sure our accessibility options work well out of the box, i.e. from first boot. This will involve working with the Ubuntu a11y team, yeah!

More information is available at
https://community.kde.org/Accessibility and
https://userbase.kde.org/Applications/Accessibility
Categories: LinuxChix bloggers

Fixing mistakes and growing stronger

Valorie Zimmerman 2 - Tue, 2014-09-09 07:11
In Creativity, Inc., Catmull explores an example of where their structure had created some problems, and how they identified and fixed that, improving their overall culture. I know this is a wall of text, but Catmull asks excellent questions, and I felt it was worthwhile to copy it for you. He says:
Improvements didn't happen overnight. But by the time we finished A Bug's Life, the production managers were no longer seen as impediments to creative process, but as peers--as first-class citizens. We had become better. 
This was success in itself, but it came with an added and unexpected benefit: The act of thinking about the problem and responding to it was invigorating and rewarding. We realized that our purpose was not merely to build a studio that made hit films but to foster a creative culture that would continually ask questions. Questions like: If we had done some things right to achieve success, how could we ensure that we understood what those things were? Could we replicate them on our next projects? Perhaps as important, was replication of success even the right thing to do? How many serious, potentially disastrous problems were lurking just out of sight and threatening to undo us? What, if anything, could we do to bring them to light? How much of our success was luck? What would happen to our egos if we continued to succeed? Would they grow so large they could hurt us, and if so, what could we do to address that overconfidence? What dynamics would arise now that we were bringing new people into a successful enterprise as opposed to a struggling startup?

What had drawn me to science, all those years ago, was the search for understanding. Human interaction is far more complex than relativity or string theory, of course, but that only made it more interesting and important; it constantly challenged my presumptions.... Figuring out how to build a sustainable creative culture--one that didn't just pay lip service to the importance of things like honesty, excellence, communication, originality, and self-assessment but really *committed* to them, no matter how uncomfortable that became--wasn't a singular assignment....

As I saw it, our mandate was to foster a culture that would seek to keep our sightlines clear, even as we accepted that we were often trying to engage with and fix what we could not see. My hope was to make this culture so vigorous that it would survive when Pixar's founding members were long gone. [p. 64-5]
Again, I see an almost perfect match between their task and ours, where ours = KDE e.V. In the Community Working Group (CWG) in particular, I see my task as essentially gardening. This includes improving the soil and weeding, but never removing the valuable little shoots which can grow into exciting new directions for the community. Of course I can't carry the metaphor too far, since others do the planting. But we can keep the conditions for growth optimal with our work.

In the documentation workshop yesterday, we explored the current state of the KDE documentation, how we can improve access, and how to grow the documentation team again. We also found some large choke points, which include KDE.org. We really need a web team! KDE.org is valuable real estate on the web, which has been neglected for too long. More about that later.....

For now, looking forward to another day of hard work and fun in Brno!


Categories: LinuxChix bloggers

Dot Reminders

Akkana Peck - Mon, 2014-09-08 03:10

I read about cool computer tricks all the time. I think "Wow, that would be a real timesaver!" And then a week later, when it actually would save me time, I've long since forgotten all about it.

After yet another session where I wanted to open a frequently opened file in emacs and thought "I think I made a bookmark for that a while back", but then decided it's easier to type the whole long pathname rather than go re-learn how to use emacs bookmarks, I finally decided I needed a reminder system -- something that would poke me and remind me of a few things I want to learn.

I used to keep cheat sheets and quick reference cards on my desk; but that never worked for me. Quick reference cards tend to be 50 things I already know, 40 things I'll never care about and 4 really great things I should try to remember. And eventually they get buried in a pile of other papers on my desk and I never see them again.

My new system is working much better. I created a file in my home directory called .reminders, in which I put a few -- just a few -- things I want to learn and start using regularly. It started out at about 6 lines but now it's grown to 12.
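
Just for illustration (these particular entries are my own sketch, not a listing of Akkana's actual file), a .reminders file might look something like this:

C-x r m / C-x r b      -- set / jump to an emacs bookmark
diff <(cmd1) <(cmd2)   -- diff the output of two commands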

Then I put this in my .zlogin (of course, you can do this for any shell, not just zsh, though the syntax may vary):

if [[ -f ~/.reminders ]]; then
    cat ~/.reminders
fi

Now, in every login shell (which for me is each new terminal window I create on my desktop), I see my reminders. Of course, I don't read them every time; but I look at them often enough that I can't forget the existence of great things like emacs bookmarks, or diff <(cmd1) <(cmd2).

And if I forget the exact keystroke or syntax, I can always cat ~/.reminders to remind myself. And after a few weeks of regular use, I finally have internalized some of these tricks, and can remove them from my .reminders file.

It's not just for tech tips, either; I've used a similar technique for reminding myself of hard-to-remember vocabulary words when I was studying Spanish. It could work for anything you want to teach yourself.

Although the details of my .reminders are specific to Linux/Unix and zsh, of course you could use a similar system on any computer. If you don't open new terminal windows, you can set a reminder to pop up when you first log in, or once a day, or whatever is right for you. The important part is to have a small set of tips that you see regularly.

Categories: LinuxChix bloggers

Simcoe’s August 2014 Checkup

Elizabeth Krumbach - Mon, 2014-09-08 00:57

This upcoming December will mark Simcoe living with the CRF diagnosis for 3 years. We’re happy to say that she continues to do well, with this latest batch of blood work showing more good news about her stable levels.

Unfortunately we brought her in a few weeks early this time following a bloody sneeze. As I’ve written earlier this year, they’ve both been a bit sneezy with an as-yet-undiagnosed issue that has been eluding all tests. Every month or so they switch off who is sneezing, but this was the first time there was any blood.

[Photo: Simcoe at the vet -- “I still don’t like vet visits.”]

Following the exam, the vet said she wasn’t worried. The bleeding was a one time thing and could have just been caused by rawness brought on by the sneezing and sniffles. Since the appointment on August 26th we haven’t seen any more problems (and the cold seems to have migrated back to Caligula).

As for her levels, it was great to see her weight come up a bit, from 9.62 to 9.94 lbs.

Her BUN and CRE levels have both shifted slightly, from 51 to 59 on BUN and 3.9 to 3.8 on CRE.

BUN: 59 (normal range: 14-36)
CRE: 3.8 (normal range: .6-2.4)

Categories: LinuxChix bloggers

Late posting: Heading to Brno for Akademy!

Valorie Zimmerman 2 - Sun, 2014-09-07 07:21
So excited to be in the air over Seattle, heading toward Vienna and Brno, and Akademy! Beside me is Scarlett Clark, who will be attending her first Akademy, and first Kubuntu meeting. We've both been sponsored by Ubuntu for the costs of travel; thank you! Scarlett was telling me, as we waited to board our first flight, how long she looked for a place to contribute to a Linux community. She said she tried for years, in many distributions, on mail lists and in IRC. What she was told was "do something." How does a first-time contributor know what is needed, where to ask, and how to make that crucial first step?

I was glad to hear that once she found the KDE-doc-english mail list, she was encouraged to stick around and get onto IRC, and was guided every step of the way. I was also happy to hear that Yuri, Sune and Jonathan Riddell all made her feel welcome, and showed her where to find the information she needed to make her contributions high quality. When Scarlett showed up in #kubuntu-devel offering to learn to package, I was over the moon with happiness. I really love to see more women involved in free and open source, and especially in KDE and Kubuntu, my Linux home.

I was a bit sad that the Debian community was not welcoming to her, with Sune the one bright spot. Yeah SUNE! (By the way, hire him!) I think she will find a nice home there as well, however, if our plans to do some common packaging between Kubuntu and Debian work out in the future. It was interesting to see the blog by the developers of systemd discussing the same issue we've been considering: the waste of time packaging the same applications and other stuff over and over again. So much wasted work, when we could really be using our time more productively. Rather than working harder, let's work smarter! Check out their blog for their take on the issue: http://0pointer.net/blog/revisiting-how-we-put-together-linux-systems.html

Welcome to Scarlett, who is planning to get her blog up and running again, and on the planets. She'll be saying more about these subjects in the future. Scarlett, and all you other first-time Akademy attendees, a hearty hug of greeting. Have a wonderful time! See me in person for a real hug!

PS: I couldn't post this until now, Sunday morning. The Debian folks here, especially Pinotree, have been great! I look forward to our meeting with them on Thursday morning.
Categories: LinuxChix bloggers

Creativity and KDE

Valorie Zimmerman 2 - Sat, 2014-09-06 07:16
Creativity Inc., by Ed Catmull, President of Pixar

My book to read for this trip finally arrived from the library last week, and I could hardly wait to dip into it. I see a profound parallel between the work we do in KDE, and the experiences Catmull recounts in his book. He structures it as "lessons learned" as he led one of the most creative teams in both entertainment and technology. His dream was always to marry the two fields, which he has done brilliantly at Pixar. He tried to make a place where you don't have to ask permission to take responsibility. [p. 51]

Always take a chance on better, even if it seems threatening, Catmull says on p. 23. When he hired a person he deemed more qualified for his job than he was, the risk paid off both creatively and personally. Playing it safe is what humans tend to do far too often, especially after they have become successful. Our stone age brains hate to lose, more than they like to win big. Knowing this about ourselves sometimes gives us the courage to 'go big' rather than 'go home.' I have seen us follow this advice in the past year or two, and I hope we have the courage to continue on our brave course.

However, experience showed Catmull that being confident about the value of innovation was not enough. We needed buy-in from the community we were trying to serve.[p 31] My observation is that the leaders in the KDE community have learned this lesson very well. The collaborative way we develop new ideas, new products, new processes helps get that buy in. However, we're not perfect. We often lack knowledge of our "end users" -- not our fellow community members, but some of the millions of students, tech workers and just plain computer users. How often do teams schedule testing sessions where they watch users as they try to accomplish tasks using our software? I know we do it, and we need to do it more often.

Some sources rate us as the largest FOSS community. This can be seen as success. This achievement can have hidden dangers, however. When Catmull ran into trouble, in spite of his 'open door' management style, he found that the good stuff was hiding the bad stuff.... When downsides coexist with upsides, as they often do, people are reluctant to explore what's bugging them for fear of being labeled complainers.[p. 63] This is really dangerous. Those downsides are poison, and they must be exposed to the light, dealt with, fixed, or they will destroy a community or a part of a community. On the upside, the KDE community created the Community Working Group (CWG), and empowered us to do our job properly. On the downside, often people hide their misgivings, their irritations, their fears, until they explode. Not only does such an explosion shock the people surrounding the damage, but it shocks the person exploding as well. And afterwards, the most we can do is often damage control, rather than helping the team grow healthier, and find more creative ways to deal with those downsides.

Another danger is that even the smartest people can form an ineffective team if they are mismatched. Focus on how a team is performing, not on the talents of the individuals within it.... Getting the right people and the right chemistry is more important than getting the right idea.[p. 74] One of the important strengths of FOSS teams, and KDE teams in particular, is that people feel free to come and go. If anyone feels walled out, or trapped in, we need to remove those barriers. When people work with those who feed their energy, they in turn can pass it along. When the current stops flowing, it's time to do something different. Of course this prevents burnout, but more important, it keeps teams feeling alive, energetic, and fun. Find, develop, and support good people, and they will find, develop, and own good ideas. [p. 76] I think we instinctively know in KDE that good ideas are common. What is unusual is someone stepping up to take those "good ideas" we are so often given and make them happen. Instead, the great stuff happens when someone has an itch, and decides to scratch it, and draws others to help her make that vision become reality.

The final idea I want to present in this post is directed to all the leaders in KDE. This doesn't mean just the board of the e.V., by the way. The leaders in KDE are those who have volunteered to maintain packages, mentor students, moderate the mail lists and forums, become channel ops in IRC, write the promo articles, release notes and announcements, do the artwork, write the documentation, keep the wikis accurate, helpful and free of spam, organize sprints and other meetings such as Akademy, translate our docs and internationalize our software, design and build in accessibility, staff booths, and take on many other responsibilities such as serving on working groups and other committees. This is a shared responsibility we carry for one another, and it is what keeps our community healthy.

It is management's job to take the long view, to intervene and protect our people from their willingness to pursue excellence at all costs. Not to do so would be irresponsible.... If we are in this for the long haul, we have to take care of ourselves, support healthy habits and encourage our employees to have fulfilling lives outside of work. [p. 77] This is the major task of the e.V. and especially the Board, in my opinion, and of course the task of the CWG as well.

Isn't this stuff great!? I'll be writing more blog posts inspired by this book as I get further into it. 
Categories: LinuxChix bloggers

Using strace to find configuration file locations

Akkana Peck - Tue, 2014-09-02 19:06

I was using strace to figure out how to set up a program, lftp, and a friend commented that he didn't know how to use it and would like to learn. I don't use strace often, but when I do, it's indispensable -- and it's easy to use. So here's a little tutorial.

My problem, in this case, was that I needed to find out what configuration file I needed to modify in order to set up an alias in lftp. The lftp man page tells you how to define an alias, but doesn't tell you how to save it for future sessions; apparently you have to edit the configuration file yourself.

But where? The man page suggested a couple of possible config file locations -- ~/.lftprc and ~/.config/lftp/rc -- but neither of those existed. I wanted to use the one that already existed. I had already set up bookmarks in lftp and it remembered them, so it must have a config file already, somewhere. I wanted to find that file and use it.

So the question was, what files does lftp read when it starts up? strace lets you snoop on a program and see what it's doing.

strace shows you all system calls being used by a program. What's a system call? Well, it's anything in section 2 of the Unix manual. You can get a complete list by typing: man 2 syscalls (you may have to install developer man pages first -- on Debian that's the manpages-dev package). But the important thing is that most file access calls -- open, read, chmod, rename, unlink (that's how you remove a file), and so on -- are system calls.

You can run a program under strace directly:

$ strace lftp sitename

Interrupt it with Ctrl-C when you've seen what you need to see.

Pruning the output

And of course, you'll see tons of crap you're not interested in, like rt_sigaction(SIGTTOU) and fcntl64(0, F_GETFL). So let's get rid of that first. The easiest way is to use grep. Let's say I want to know every file that lftp opens. I can do it like this:

$ strace lftp sitename |& grep open

I have to use |& instead of just | because strace prints its output on stderr instead of stdout.
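
(A side note of my own, not from the original article: |& is bash and zsh shorthand for piping stderr along with stdout; in a plain POSIX shell you can get the same effect with an explicit redirect.)

$ strace lftp sitename 2>&1 | grep open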

That's pretty useful, but it's still too much. I really don't care to know about strace opening a bazillion files in /usr/share/locale/en_US/LC_MESSAGES, or libraries like /usr/lib/i386-linux-gnu/libp11-kit.so.0.

In this case, I'm looking for config files, so I really only want to know which files it opens in my home directory. Like this:

$ strace lftp sitename |& grep 'open.*/home/akkana'

In other words, show me just the lines that have the word "open" followed later by the string "/home/akkana".

Digression: grep pipelines

Now, you might think that you could use a simpler pipeline with two greps:

$ strace lftp sitename |& grep open | grep /home/akkana

But that doesn't work -- nothing prints out. Why? Because when grep's output goes to a pipe rather than a terminal, it buffers that output, so the second grep doesn't see anything until the first one has accumulated quite a lot of output. (This comes up a lot with tail -f as well.) You can avoid that with

$ strace lftp sitename |& grep --line-buffered open | grep /home/akkana

but that's too much to type, if you ask me.

Back to that strace | grep

Okay, whichever way you grep for open and your home directory, it gives:

open("/home/akkana/.local/share/lftp/bookmarks", O_RDONLY|O_LARGEFILE) = 5
open("/home/akkana/.netrc", O_RDONLY|O_LARGEFILE) = -1 ENOENT (No such file or directory)
open("/home/akkana/.local/share/lftp/rl_history", O_RDONLY|O_LARGEFILE) = 5
open("/home/akkana/.inputrc", O_RDONLY|O_LARGEFILE) = 5

Now we're getting somewhere! The file where it's getting its bookmarks is ~/.local/share/lftp/bookmarks -- and I probably can't use that to set my alias.

But wait, why doesn't it show lftp trying to open those other config files?

Using script to save the output

At this point, you might be sick of running those grep pipelines over and over. Most of the time, when I run strace, instead of piping it through grep I run it under script to save the whole output.

script is one of those poorly named, ungoogleable commands, but it's incredibly useful. It runs a subshell and saves everything that appears in that subshell, both what you type and all the output, in a file.

Start script, then run lftp inside it:

$ script /tmp/lftp.strace
Script started on Tue 26 Aug 2014 12:58:30 PM MDT
$ strace lftp sitename

After the flood of output stops, I type Ctrl-D or Ctrl-C to exit lftp, then another Ctrl-D to exit the subshell script is using. Now all the strace output is in /tmp/lftp.strace and I can grep in it, view it in an editor, or anything else I want.

So, what files is it looking for in my home directory, and why don't they show up as open attempts?

$ grep /home/akkana /tmp/lftp.strace

Ah, there it is! A bunch of lines like this:

access("/home/akkana/.lftprc", R_OK) = -1 ENOENT (No such file or directory)
stat64("/home/akkana/.lftp", 0xbff821a0) = -1 ENOENT (No such file or directory)
mkdir("/home/akkana/.config", 0755) = -1 EEXIST (File exists)
mkdir("/home/akkana/.config/lftp", 0755) = -1 EEXIST (File exists)
access("/home/akkana/.config/lftp/rc", R_OK) = 0

So I should have looked for access and stat as well as open. Now I have the list of files it's looking for. And, curiously, it creates ~/.config/lftp if it doesn't exist already, even though it's not going to write anything there.
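
(Another aside of my own, not part of the original workflow: strace can also do the filtering itself. Its -e trace=file option restricts the trace to system calls that take a filename -- open, access, stat, mkdir and friends -- and -o writes the trace to a file, so you get much the same result without script or grep pipelines:)

$ strace -e trace=file -o /tmp/lftp.trace lftp sitename
$ grep /home/akkana /tmp/lftp.trace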

So I created ~/.config/lftp/rc and put my alias there. Worked fine. And I was able to edit my bookmark in ~/.local/share/lftp/bookmarks later when I had a need for that. All thanks to strace.

Categories: LinuxChix bloggers

CI, Validation and more at DebConf14

Elizabeth Krumbach - Mon, 2014-09-01 18:54

I’ve been a Debian user since 2002 and got my first package into Debian in 2006. Though I continued to maintain a couple of packages through the years, my open source interests (and career) have expanded significantly, so that I now spend much more time with Ubuntu and OpenStack than anything else. Still, I host Bay Area Debian events in San Francisco, and when I learned that DebConf14 would be only a quick plane flight away from home, I was eager for the opportunity to attend.

Given my other obligations, I decided to come in halfway through the conference, arriving Wednesday evening. Thursday was particularly interesting to me because they were doing most of the Debian Validation & CI discussions then. Given my day job on the OpenStack Infrastructure team, it seemed to be a great place to meet other folks who are interested in CI and see where our team could support Debian’s initiatives.

First up was the Validation and Continuous Integration BoF led by Neil Williams.

It was interesting to learn the current validation methods being used in Debian, including:

From there the talk moved into what kinds of integration tests people wanted. Various ideas were covered, including package sets (collections of related packages) and how to inject “dirty” data into systems to test situations more like the real world. Someone also mentioned running tests on real systems rather than in chrooted environments.

Discussion touched upon having a Gerrit-like workflow that had packages submitted for review and testing prior to landing in the archive. This led to my having some interesting conversations with the drivers of Gerrit efforts in Debian after the session (nice to meet you, mika!). There was also discussion about notification to developers when their packages run afoul of the testing infrastructure, either themselves or as part of a dependency chain (who wants notifications? how to make them useful and not overwhelming?).

I’ve uploaded the gobby notes from the session here: validation-bof and the video of the session is available on the meetings-archive.

Next up on the schedule was debci and the Debian Continuous Integration project presented by Antonio Terceiro. He gave a tour of the Debian Continuous Integration system and talked about how packages can take advantage of the system by having their own test suites. He also discussed some about the current architecture for handling tests and optimizations they want to make in the future. Documentation for debci can be found here: ci.debian.net/doc/. Video of the session is also available on the meetings-archive.

The final CI talk I went to of the day was Automated Validation in Debian using LAVA where Neil Williams gave a tour of the expanded LAVA (Linaro Automated Validation Architecture). I heard about it back when it was a more simple ARM-only testing infrastructure, but it’s grown beyond that to now test distribution kernel images, package combinations and installer images and has been encouraging folks to write tests. He also talked about some of the work they’re doing to bring along LAVA demo stations to conferences, nice! Slides from this talk are available on the debconf annex site, here: http://annex.debconf.org/debconf-share/debconf14/slides/lava/

On Friday I also bumped into a testing-related talk by Paul Wise during a series of Live Demos, he showed off check-all-the-things which runs a pile of tools against your project to check… all the things, detecting what it needs to do automatically. Check out the README for rationale, and for a taste of things it checks and future plans, have a peek at some of the data files, like this one.

It’s really exciting to see more effort being spent on testing in Debian, and open source projects in general. This has long been the space of companies doing private, internal testing of open source products they use and reporting results back to projects in the form of patches and bug reports. Having the projects themselves provide QA is a huge step for the maturity of open source, and I believe will lead to even more success for projects as we move into the future.

The rest of DebConf for me was following my more personal interests in Debian. I also have to admit that my lack of involvement lately made me feel like a bit of an outsider and I’m quite shy anyway, so I was thankful to know a few Debian folks who I could hang out with and join for meals.

On Thursday evening I attended A glimpse into a systemd future by Josh Triplett. I haven’t really been keeping up with systemd news or features, so I learned a lot. I have to say, it would be great to see things like session management, screen brightness and other user settings be controlled by something lower level than the desktop environment. Friday I attended Thomas Goirand’s OpenStack update & packaging experience sharing. I’ve been loosely tracking this, but it was good to learn that Jessie will come with Icehouse and that install docs exist for Wheezy (here).

I also attended Outsourcing your webapp maintenance to Debian with Francois Marier. The rationale for his talk was that one should build their application with the mature versions of web frameworks included with Debian in mind, making it so you don’t have the burden of, say, managing Django along with your Django-based app, since Debian handles that. I continue to have mixed feelings when it comes to webapps in the main Debian repository: while some developers who are interested in reducing maintenance burden are ok with using older versions shipped with Debian, most developers I’ve worked with are very much not in this camp, and I’m better off trying to support what they want than fighting with them about versions. Then it was off to Docker + Debian = ♥ with Paul Tagliamonte where he talked about some of his best practices for using Docker on Debian and ideas for leveraging it more in development (having multiple versions of services running on one host, exporting docker images to help with replication of tests and development environments).

Friday night Linus Torvalds joined us for a Q&A session. As someone who has put a lot of work into making friendly environments for new open source contributors, I can’t say I’m thrilled with his abrasive conduct in the Linux kernel project. I do worry that he sets a tone that impressionable kernel hackers then go on to emulate, perpetuating the caustic environment that spills out beyond just the kernel, but he has no interest in changing. That aside, it was interesting to hear him talk about other aspects of his work, his thoughts on systemd, a rant about compiling against specific libraries for every distro and versions (companies won’t do it, they’ll just ship their own statically linked ones) and his continued comments in support of Google Chrome.

DebConf wrapped up on Sunday. I spent the morning in one of the HackLabs catching up on some work, and at 1:30 headed up to the Plenary room for the last few talks of the event, starting with a series of lightning talks. A few of the talks stood out for me, including Geoffrey Thomas’ talk on being a bit of an outsider at DebConf and how difficult it is to be running a non-Debian/Linux system at the event. I’ve long been disappointed when people bring along their proprietary OSes to Linux events, but he made good points about people being able to contribute without fully “buying in” to having free software everywhere, including their laptop. He’s right. Margarita Manterola shared some stats from the Mini-DebConf Barcelona, which focused on having only female speakers. It was great to hear such positive statistics, particularly since DebConf14 itself had a pretty poor ratio; there were several talks I attended (particularly around CI) where I was the only woman in the room. It was also interesting to learn about safe-rm to save us from ourselves and non-free.org to help make a distinction between what is Debian and what is not.

There was also a great talk by Vagrant Cascadian about his work on packages that he saw needed help but he didn’t necessarily know everything about, and encouraged others to take the same leap to work on things that may be outside their comfort zone. To help he listed several resources people could use to find work in Debian:

Next up for the afternoon was the Bits from the Release Team, where they fleshed out what the next few months leading up to the freeze would look like and shared the Jessie Freeze Policy.

DebConf wrapped up with a thank you to the volunteers (thank you!) and a peek at the next DebConf, to be held in Heidelberg, Germany the 15th-22nd of August 2015.

Then it was off to the airport for me!

The rest of my photos from DebConf14 here: https://www.flickr.com/photos/pleia2/sets/72157646626186269/

Categories: LinuxChix bloggers

Ask Erica: “Why Did You Decide to Coach Full-Time?”

Erica Douglass - Mon, 2014-09-01 10:23

In mid-July, I announced my new direction: coaching successful entrepreneurs full-time.

By far, the most popular question I’ve gotten is “Why did you decide to coach full-time?”

The answer, while simple, took me a while to get to. If you’re feeling stuck on your current path, keep reading, as the conclusion I came to (and how I figured it out) may help you, as well!

The Question That Changed Everything

The most motivating factor in my new career path as a CEO coach is contained within my answer to this question: “What’s the most exciting moment you’ve ever had?”

When I opened up and looked back honestly, my most exciting moments were watching people go through breakthroughs, like the story I told about my dinner with Ramit Sethi in my July post. Watching people right in front of me have a powerful emotional transformation–those were the moments I remembered, the moments I lived for as a person.

There were other signs, too. When I first met Jason Seats at Techstars, for instance, I had no intention of going through the Techstars program as a founder (though now I’m very glad I said yes to that and went through the program!). I wanted to be a mentor. I wanted to help other founders.

Then, there was the fact that I just couldn’t seem to stop helping people. Even when I didn’t feel like I had much time, I would always drop what I was doing to help someone out. Especially the rising stars like Ramit–I knew these folks were going places, and it was always so fun to help them get where they were going.

Digging Deep Into Myself

To go from “I like helping people” to coaching full-time, though–that was a transformation! Through the past three months, as I went through the sale of my business, took over a month off, and then spent another 5 weeks in solidarity with the question “What do I really want to do most?”, I dug deep to find out how I really wanted to make an impact on the world.

I’m here to coach because I have been through it all. In the past 13 years, I’ve run several bootstrapped companies and one funded company. In all, I’ve sold three technology companies. I have personally made over $3 million online–all of that being sales from companies I’ve created from nothing.

I’ve hired some of the greatest people I’ve ever had the pleasure of knowing–and hired the worst people I’ve ever dealt with! (All of which I take full responsibility for.) I’ve worked every side of a business from software development/programming to hardware to operations to management to technical support.

Whatever crisis founders of six-figure and seven-figure-a-year businesses are going through–I’ve pretty much been there. I’m not perfect, and as a seasoned entrepreneur, I know you aren’t either. That’s why I became a coach–to help you work through whatever barriers come up as you make the transformation from successful business owner to the huge next level that is waiting for you.

Struggling to Find a Mentor

When I get interviewed by the media, interviewers always love to ask: “Who is your mentor?” I’ve often struggled with that question.

Nearly 10 years ago, I desperately asked on Web Hosting Talk if anyone else knew another woman running a web hosting company that was about to hit 7 figures in annual revenue–and there was silence.

I couldn’t believe I was the only one–but back then there were probably only a handful of female founders of web hosting companies at all, let alone ones making 7 figures a year. I was a pioneer, and for that I was grateful–but it was a searing, raw, emotional experience to feel all alone in that role. I will never forget that experience–calling out, “Where is my mentor?” and hearing only silence.

Today, of course, there are many people who add “Mentor” or “Coach” to their resumes or LinkedIn profiles. But there are scarce few who have actually made millions of dollars online running real businesses–who are now coaching. I know, because I have looked for them! I have begged to be coached by people who are where I want to be, and the answer is often: “No.”

Why Most Successful Entrepreneurs Don’t Coach

Why? Successful entrepreneurs will tell you the answer: There is no leverage in coaching. They don’t have time for it. They are busy running successful companies.

I can tell you, truthfully, that that was the biggest block I had to get over as a coach. I knew that to be a successful coach, I’d have to commit to it full-time. That commitment meant I would not be focused on growing a scalable business (at least for the time being.)

I had a lot of fear around that. Was I basically tying bricks to my feet by creating a business where my income didn’t scale with more products sold?

To really get over the fear, I had to go back to what fulfilled me the most. Did I want to have a little impact on a lot of people (for instance, by writing a book) or did I want to have a huge impact on a few people?

I remembered the feeling I got when I saw people transform and their barriers break down. I decided I wanted more of that–and that was the answer for me. Coach full-time. Make the commitment. Enable the transformations to happen.

Making the commitment was scary for me, but I did it publicly so I couldn’t turn back. And it’s paid off–now, not only do I have amazing paying clients, but I’m excited every day to get up and start working. My coaching calls are transformative for my clients–and they are also transformative for me. In that way, I consider myself deeply blessed.

Who Are Your Customers?

My other huge fear came from the coaching I’ve done in the past. Previously, I’d worked with entrepreneurs who were just getting started, and I hadn’t charged much for coaching. They couldn’t afford it, and I didn’t have the self-confidence at that time to charge more.

Some clients went far. But with others, I’d spend an entire hour 1:1 and we’d never be able to dig deep into their real problems–because we were too busy grappling with “What idea should I work on?” or “How can I get this WordPress theme set up?” It wasn’t fulfilling for me or for my clients!

It was with this concern in mind that I read The Prosperous Coach, a fantastic book written by my friend and successful coach Rich Litvin. In the book, he described clearly the clients he was going after–successful, high-powered women.

While reading his book, I had a complete epiphany. It’s one of those epiphanies that seem so obvious afterward. It went something like, “OH! I can define who I want as a client!”

Becoming a Better Coach by Defining Who I Want as a Customer

I don’t know why that hadn’t occurred to me in the context of coaching before. In the marketing world, defining your customer avatar is an integral part of setting up a marketing plan. But I hadn’t thought to apply that concept here.

I thought deeply about who I’d had the best results with, and a pattern quickly emerged, with Ramit circa 2008 being my defining avatar. Someone whom I know is going to be successful. They’ve already set up a website. They have a product with customers. They’re making well into 6 figures or 7 figures and they’re facing a huge pivot point in their business–do I sell the company? Who do I hire to help me out? How do I raise my next round of capital?

That’s the point at which hiring a coach delivers huge results, and where my expertise becomes most valuable. Selling your company? I’d be happy to help you navigate those tricky waters; I’ve sold three. Raising a round of funding? I’ve seen hundreds of pitch decks, raised $640,000 for my own company, and won a pitch competition. Hiring or firing the right person? We could spend hours on that alone!

Those are the inflection points where having someone to talk to who’s been there is most critical in your business. How much equity do you give your new COO to make sure he or she sticks around but you’re not “giving up the house”? What’s the best process to find a buyer for your company? Which investors should you talk to and how much should you raise (or should you even raise at all)? Or: You’re working 70 hours a week and you feel like you’re drowning, but you don’t know who to hire or where to outsource first. These are all scenarios for which the answer may mean a 7-figure swing in your business either way. And it’s those areas where I deliver the most impact as a coach.

Why Only Four People?

In my July blog post, I mentioned I would be taking 4 clients. Why only 4 clients? (Another popular question!) I always smile when I give the answer: Because, for the first time in my life, I’m undercommitting myself so I can serve those 4 people with my full attention. With only 4 clients, I have time to look over paying clients’ pitch decks, make intros, and help guide them through selling and/or financing companies. I doubt it will surprise you to read: It’s been the best decision I’ve made so far!

I currently have 30 applications in, and I’ve already filled 2 of my 4 available slots with paying clients. I’m continuing to do coaching with applicants over the next few weeks, and I expect the other 2 slots will fill quickly. If you meet the criteria (6-figure or 7-figure business at a pivot point; looking for your next steps) and would like to be considered, please apply here.

Going full-time into coaching was a gutsy move, and an unexpected one. But, in a way I haven’t felt in a long time, it feels right. I got off my coaching call recently with a new client and told my roommate, “I can’t believe I get paid to do this!”

I help create miracles in successful entrepreneurs’ lives, and at this time, there’s nothing I’d rather be doing.



Categories: LinuxChix bloggers

Steven Pinker's The Blank Slate: The Modern Denial of Human Nature

Valorie Zimmerman 2 - Mon, 2014-09-01 08:03

Interesting, engaging, and sometimes challenging. My only criticism of the book is that he dwells a bit on fads in academia which are fading, but since he's been extensively challenged by that crowd, I suppose it is forgivable.

I'll quote extensively from the last chapter, but first, Emily Dickinson (quoted in that final chapter):
The Brain--is wider than the Sky--
For--put them side to side--
The one the other will contain
With ease--and you--beside-- 
The Brain is deeper than the sea--
For--hold them--Blue to Blue--
The one the other will absorb--
As Sponges--Buckets--do-- 
The Brain is just the weight of God--
For--Heft them--Pound for Pound--
And they will differ--if they do--
As Syllable from Sound--
And the beginning of the final chapter:
The Blank Slate was an attractive vision. It promised to make racism, sexism, and class prejudice factually untenable. It appeared to be a bulwark against the kind of thinking that led to ethnic genocide. It aimed to prevent people from slipping into a premature fatalism about preventable social ills. It put the spotlight on the treatment of children, indigenous peoples, and the underclass. The Blank Slate thus became part of secular faith and appeared to constitute the common decency of our age.  
But the Blank Slate had, and has, a dark side. The vacuum that was posited in human nature was eagerly filled by totalitarian regimes, and it did nothing to prevent their genocides. It perverts education, child-rearing, and the arts into forms of social engineering. It torments mothers who work outside the home and parents whose children did not turn out as they would have liked. It threatens to outlaw biomedical research that could alleviate human suffering. Its corollary, the Noble Savage, invites contempt for the principles of democracy and of "a government of laws not of men." It blinds us to our cognitive and moral shortcomings. And in matters of policy it has elevated sappy dogmas above the search for workable solutions. 
The Blank Slate is not some ideal that we should all hope and pray is true. No, it is anti-life, anti-human theoretical abstraction that denies our common humanity, our inherent interests, and our individual preferences. Though it has pretensions of celebrating our potential, it does the opposite, because our potential comes from the combinatorial interplay of wonderfully complex faculties, not from the passive blankness of an empty tablet. 
Regardless of its good and bad effects, the Blank Slate is an empirical hypothesis about the functioning of the brain and must be evaluated in terms of whether or not it is true. The modern sciences of mind, brain, genes, and evolution are increasingly showing that it is not true. The result is a rearguard effort to salvage the Blank Slate by disfiguring science and intellectual life: denying the possibility of objectivity and truth, dumbing down issues into dichotomies, replacing facts and logic with intellectual posturing. 
The Blank Slate became so deeply entrenched in intellectual life that the prospect of doing without it can be deeply unsettling. ...Is science leading to a place where prejudice is right, where children may be neglected, where Machiavellianism is accepted, where inequality and violence are met with resignation, where people are treated like machines? 
Not at all! By unhandcuffing widely shared values from moribund factual dogmas, the rationale for these values can only become clearer. We understand *why* we condemn prejudice, cruelty to children, and violence against women, and can focus our efforts on how to implement the goals we value most. ... 
... Acknowledging human nature does not mean overturning our personal world views... It means only taking intellectual life out of its parallel universe and reuniting it with science and, when it is borne out by science, by common sense.
This book was published in 2002, and I think Pinker and his fellow scientists who investigate human nature are beginning to make headway. It was a good reminder of some of the nonsense we are now sweeping into the dustbin of history, and of the new understanding of human nature now coming to light.
Categories: LinuxChix bloggers

Bash Arrays

Renata - Wed, 2014-08-27 17:47

Arrays are helpful, and I’ll give some examples for reference. They can be a little bit confusing, but once you get used to them, it’s easy!

First you initialize the arrays

cat[1]="Bub"
cat[2]="Grumpy"
cat[3]="Luna"

feat[1]="cute"
feat[2]="terrible"
feat[3]="fashion"

Then you use them as you wish. You can, at first, just list them individually

echo "${cat[3]} is ${feat[1]}"

or list all of the items in a specific array:

echo "Cats I like: ${cat[@]}"

Something like that would also work:

for i in {1..3}
do
echo "${cat[i]} is ${feat[i]}!"
done
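
(A variation of my own, not from the original post: ${!cat[@]} expands to the indices that are actually set, and ${#cat[@]} gives the number of elements, so the loop doesn't need the hard-coded {1..3}.)

for i in "${!cat[@]}"
do
    echo "${cat[i]} is ${feat[i]}!"
done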

That opens many possibilities. Life is not only about internet cats (although it sometimes seems so).

Make good use of your arrays, they’re great!

(It takes me 8 months to update the site and then I write a silly post about bash arrays, I know. Sorry, I was thinking about them.)

Categories: LinuxChix bloggers

OpenStack Infrastructure August 2014 Bug Day

Elizabeth Krumbach - Wed, 2014-08-27 00:08

The OpenStack Infrastructure team has a pretty big bug collection.

[Photo: “1855 collection” -- well, not literal bugs]

We’ve slowly been moving new bugs for some projects over to StoryBoard in order to kick the tires on that new system, but today we focused back on our Launchpad bugs to pare down our list.

Interested in running a bug day? The steps we have for running a bug day can be a bit tedious, but it’s not hard. Here’s the rundown:

  1. I create our etherpad: cibugreview-august2014 (see etherpad from past bug days on the wiki at: InfraTeam#Bugs)
  2. I run my simple infra_bugday.py script and populate the etherpad.
  3. Grab the bug stats from launchpad and copy them into the pad so we (hopefully) have inspiring statistics at the end of the day.
  4. Then comes the real work. I open up the old etherpad and go through all the bugs, copying over comments from the old etherpad where applicable and making my own comments as necessary about obvious updates I see (and updating my own bugs).
  5. Let the rest of the team dive in on the etherpad and bugs!

Throughout the day we chat in #openstack-infra about bug statuses, whether we should continue pursuing certain strategies outlined in bugs, reaching out to folks who have outstanding bugs in the tracker that we’d like to see movement on but haven’t in a while. Plus, we get to triage a whole pile of New bugs (thanks Clark) and close others we may have lost track of (thanks everyone).

As we wrap up, here are the stats from today:

Starting bug day count: 270

31 New bugs
39 In-progress bugs
6 Critical bugs
15 High importance bugs
8 Incomplete bugs

Ending bug day count: 233

0 New bugs
37 In-progress bugs
3 Critical bugs
10 High importance bugs
14 Incomplete bugs

Full disclosure, 4 of the bugs we “closed” were actually moved to the Zuul project on Launchpad so we can import them into StoryBoard at a later date. The rest were legitimate though!

It was a busy day, thanks to everyone who participated.

Categories: LinuxChix bloggers