GIT shortcuts
$ git config --global alias.co checkout
$ git config --global alias.br branch
$ git config --global alias.ci commit
$ git config --global alias.st status
$ git config --global alias.unstage 'reset HEAD --'
$ git config --global alias.last 'log -1 HEAD'
Push local commits to a remote branch (here local master to remote_branch_name on the server)
$ git push origin master:remote_branch_name
Check out a remote branch locally, start working on it, and have the local copy track ('linked to') the remote branch
$ git checkout --track origin/gradle
diff a file between two commits
$ git diff -w 9655d51..c48be72 -- src/uniflow/plate.java
Revert a local uncommitted change back to its content in the last commit
$ git checkout -- file.java // same as svn revert
Undo the last commit (create a new commit that restores the files to the previous commit's state)
$ git revert HEAD
Unstage a file (keep the working file content unchanged, including these unstaged changes)
$ git reset file
Remove two local commits (last two commits will be removed, don't do this to published commits)
$ git reset --hard HEAD~2
$ git reset --hard // obliterates all uncommitted changes.
Modify previous commit (e.g. commit additional files) or commit message
$ git commit --amend
Reset local branch to match exactly the remote branch
$ git checkout develop // select the target branch to reset
$ git fetch origin
$ git reset --hard origin/develop
remove untracked files
$ git clean -n // dry run, just list files to be removed
$ git clean -f // force remove untracked files
$ git clean -df // force remove both untracked files and directories
Merge multiple local commits into one commit
$ git rebase -i origin/master
pick 16b5fcc Code in, tests not passing
squash c964dea Getting closer
squash 06cf8ee Something changed
squash 396b4a3 Tests pass
squash 9be7fdb Better comments
squash 7dba9cb All done
Note: change all 'pick' to 'squash' except for the first line in the pop up editor
merge multiple commits locally
$ git rebase -i
merge last two commits
$ git rebase -i HEAD~2
Create a new local branch
$ git checkout -b experimental_tilde
Abort rebase
$ git rebase --abort
Delete a local branch
$ git branch -d experimental_tilde
Delete a remote branch
$ git branch -d -r origin/branch_name // removes only the local remote-tracking ref; to delete the branch on the server use: git push origin --delete branch_name
Daily sync from remote master to local feature branch
# assume you are on local feature branch
# stash the local uncommitted changes
$ git stash
# pull in latest changes from master
$ git co master
$ git pull
# switch to your feature branch
$ git co myfeature
# rebase the feature branch on top of master, alternatively one could merge (see https://stackoverflow.com/questions/13006135/copy-last-commits-from-master-to-branch)
$ git rebase master
# reapply your previously saved local changes.
$ git stash pop
Add feature branch changes to master through rebase. Simple version; see the next entry for more options
$ git checkout feature
$ git checkout -b temporary-branch # create temp branch for rebasing
$ git rebase -i master # [Clean up the history]
$ git checkout master
$ git merge temporary-branch # put the changes back to master.
$ git push
Rebase feature branch onto master (https://randyfay.com/content/rebase-workflow-git)
Assume you have a master branch for shared work, and a personal branch called feature.
$ git checkout feature # feature branch to do local work
$ # do some local commits on this branch.
$ git fetch origin # pull changes from remote
$ git rebase -i master # plop our branch changes onto everyone else’s. There might be conflicts, need to resolve and resume the rebase using git rebase --continue
# now our branch has the latest ideal code with all the local and remote changes. Need to publish it. We will switch to master and make it the same as our rebased branch.
$ git checkout master # switch to master branch
# there are a few ways to push changes back to remote
1. $ git rebase feature # now master is fast-forwarded to have feature commits on it
2. $ git merge feature # same effect as option 1 rebase
3. $ git merge --squash feature # squash all the branch commits into one commit on the master branch
4. $ git merge --no-ff feature # explicitly document the merge by adding a merge commit
$ git push origin master # push the master change back to the remote
Find out number of file changes between two tags, exclude type-change items
$ git diff v6.1.0001..v6.1.0003 --name-status --diff-filter=t | wc -l
Find the commits between two tags
$ git log v6.1.0001..v6.1.0003 --pretty=oneline
Move the last commit to a new feature branch
# So you have been working on the master branch (although on a feature) and committed something locally; however, it really belongs to a feature branch and needs more work. Here is a way to create that new branch containing the commit and back the commit out of the master branch.
# assume you are currently on the master branch
$ git checkout -b new_branch # make a new branch containing the last commit
$ git checkout master
$ git reset --hard HEAD~1 # move the master head back one commit; that commit now lives only in the new branch
$ git checkout new_branch # move to the new branch to continue working on it.
Git password free (store credentials after the first use)
$ cd
$ git config credential.helper store # add --global to apply to all repositories
$ git fetch # prompts for the password once, then stores it for future use
Tuesday, June 12, 2018
GIT cheat sheet
Wednesday, May 30, 2018
Elixir load module file
iex> import_file("path/to/script_file.exs")
Thanks to this blog post!
Thursday, January 19, 2017
Customize User Authentication in Grails
In a Grails project, I need to implement an additional authentication mechanism: username plus a short-lived authentication token that the app knows how to decode and verify. This route allows a 2nd app to log in on behalf of a user, without knowing his/her password, as long as there is a pre-agreement on secure token generation (this might sound strange, but that is a separate topic; think poor man's single sign-on).
Spring Security is complicated, and Grails adds another layer of opacity to it. To make this happen in Grails, we need to:
- create an AuthenticationProvider that defines the customized authentication logic.
- create an AuthenticationProcessingFilter that defines under what conditions such authentication should be activated.
Here are the details:
- In order to have your own authentication, you need an AuthenticationProvider:
public class AppUserAuthenticationProvider extends DaoAuthenticationProvider {

    @Override
    public boolean supports(Class authentication) {
        return (AppUserAuthenticationToken.class.isAssignableFrom(authentication));
    }

    @Override
    protected void additionalAuthenticationChecks(UserDetails userDetails,
            UsernamePasswordAuthenticationToken authentication) throws AuthenticationException {
        String name = authentication.getName()
        String key = (String) authentication.getCredentials()
        if (key == null) {
            logger.debug("App Authentication failed: no credentials provided");
            throw new BadCredentialsException(messages.getMessage(
                "AbstractUserDetailsAuthenticationProvider.badCredentials", "Bad credentials"), userDetails);
        }
        // customized logic to validate the username and authentication token
        if (! AppAuthConnector.validateAppKey(name, key)) {
            logger.debug("App Authentication failed: app_auth_key is invalid");
            throw new BadCredentialsException(messages.getMessage(
                "AbstractUserDetailsAuthenticationProvider.badCredentials", "Bad credentials"), userDetails);
        }
    }
}

public class AppUserAuthenticationToken extends UsernamePasswordAuthenticationToken {

    public AppUserAuthenticationToken(String username, String key) {
        super(username, key);
    }

    public AppUserAuthenticationToken(UserDetails principal, String credentials) {
        super(principal, credentials, principal.getAuthorities());
    }
}
Note that the customized provider does not override
public Authentication authenticate(Authentication authentication) throws AuthenticationException { }
but rather
protected void additionalAuthenticationChecks(UserDetails userDetails, UsernamePasswordAuthenticationToken authentication) throws AuthenticationException { }
The latter is one of many steps inside the authenticate() method, and authenticate() also includes some default checks that I want to inherit, for example preAuthenticationChecks and postAuthenticationChecks, which verify that the user account is not locked, disabled, or expired and that its credentials have not expired. If we simply override authenticate() without properly implementing these checks, our customized authentication will introduce loopholes, for example allowing an expired user to log in from that 2nd app.
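(Aside, not from the original post.) The actual token check, AppAuthConnector.validateAppKey(), is application specific and omitted above. Purely as an illustration of the pre-agreed token scheme, a shared-secret HMAC with an embedded timestamp could look roughly like the sketch below; the token format, constants, and class body are all hypothetical:

import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Hypothetical sketch only; not the real AppAuthConnector.
// Assumed token format: base64("<epochMillis>:<base64 HMAC-SHA256 of 'username:epochMillis'>")
public class AppAuthConnector {

    private static final String SHARED_SECRET = "change-me";   // pre-agreed with the 2nd app
    private static final long MAX_AGE_MILLIS = 5 * 60 * 1000L; // short-lived: 5 minutes

    public static boolean validateAppKey(String username, String key) {
        try {
            String decoded = new String(Base64.getDecoder().decode(key), StandardCharsets.UTF_8);
            String[] parts = decoded.split(":", 2);             // [issuedAt, signature]
            long issuedAt = Long.parseLong(parts[0]);
            if (System.currentTimeMillis() - issuedAt > MAX_AGE_MILLIS) {
                return false;                                   // token expired
            }
            String expected = hmacSha256(username + ":" + parts[0], SHARED_SECRET);
            return expected.equals(parts[1]);                   // a constant-time compare is better in production
        } catch (Exception e) {
            return false;                                       // malformed token
        }
    }

    private static String hmacSha256(String data, String secret) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(secret.getBytes(StandardCharsets.UTF_8), "HmacSHA256"));
        return Base64.getEncoder().encodeToString(mac.doFinal(data.getBytes(StandardCharsets.UTF_8)));
    }
}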
- Register this provider as a bean so it can be used later, in grails-app/conf/spring/resources.groovy:
beans = {
    ...
    appUserAuthenticationProvider(AppUserAuthenticationProvider) {
        userDetailsService = ref('userDetailsService')
    }
}
- Register this provider bean with Spring Security, in grails-app/conf/Config.groovy:
grails.plugin.springsecurity.providerNames = [
    'appUserAuthenticationProvider',
    'daoAuthenticationProvider',
    'rememberMeAuthenticationProvider']
- Now that the provider is set, we need to define when to activate this authentication path. In my case I want this mechanism to apply to any request, as long as the two parameters 'app_username' and 'app_key' are present in the URL or in the request headers. We need an AbstractAuthenticationProcessingFilter that can intercept the processing:
public class AppUserAuthenticationFilter extends AbstractAuthenticationProcessingFilter {

    public AppUserAuthenticationFilter() {
        // the matcher will detect the presence of the URL parameters
        super(new AppUserAuthenticationRequestMatcher())
        // what to do after the authentication is successful
        setAuthenticationSuccessHandler(new AppUserAuthenticationSuccessHandler())
    }

    @Override
    public Authentication attemptAuthentication(HttpServletRequest req, HttpServletResponse resp)
            throws AuthenticationException, IOException, ServletException {
        String key = AppUserAuthenticationRequestMatcher.getKey(req)
        String username = AppUserAuthenticationRequestMatcher.getName(req)
        return this.getAuthenticationManager().authenticate(new AppUserAuthenticationToken(username, key))
    }
}

class AppUserAuthenticationSuccessHandler extends SimpleUrlAuthenticationSuccessHandler {

    @Override
    public void onAuthenticationSuccess(HttpServletRequest request, HttpServletResponse response,
            Authentication authentication) throws ServletException, IOException {
        clearAuthenticationAttributes(request)
        request.getRequestDispatcher(request.getServletPath()).forward(request, response)
    }
}

public class AppUserAuthenticationRequestMatcher implements RequestMatcher {

    public static final String APP_AUTH_NAME = "app_username"
    public static final String APP_AUTH_KEY = "app_key"

    public boolean matches(javax.servlet.http.HttpServletRequest request) {
        String uri = request.getRequestURI()
        if (uri != null
                && AppUserAuthenticationRequestMatcher.getKey(request) != null
                && AppUserAuthenticationRequestMatcher.getName(request) != null) {
            return true;
        } else {
            return false;
        }
    }

    public static String getRequestParam(HttpServletRequest request, String param) {
        String k = request.getParameter(param);
        if (k == null) {
            String hn = "X-" + param;
            k = request.getHeader(hn);
        }
        return k;
    }

    public static String getKey(HttpServletRequest request) {
        return AppUserAuthenticationRequestMatcher.getRequestParam(request, AppUserAuthenticationRequestMatcher.APP_AUTH_KEY);
    }

    public static String getName(HttpServletRequest request) {
        return AppUserAuthenticationRequestMatcher.getRequestParam(request, AppUserAuthenticationRequestMatcher.APP_AUTH_NAME);
    }
}
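(Not in the original post.) To make the activation condition concrete, the 2nd app could call any protected URL and supply the two values as headers (or as app_username / app_key URL parameters). The endpoint and values below are made up:

import java.net.HttpURLConnection;
import java.net.URL;

// Hypothetical caller from the 2nd app; the URL is made up.
public class AppClientExample {
    public static void main(String[] args) throws Exception {
        URL url = new URL("https://myapp.example.com/secure/report");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        // the request matcher above looks for "X-" + parameter name in the headers
        conn.setRequestProperty("X-app_username", "jdoe");
        conn.setRequestProperty("X-app_key", "the-short-lived-token");
        System.out.println("HTTP status: " + conn.getResponseCode());
    }
}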
- Register this filter as a bean, in grails-app/conf/spring/resources.groovy:
beans = {
    ...
    appUserAuthenticationFilter(AppUserAuthenticationFilter) {
        sessionAuthenticationStrategy = ref('sessionAuthenticationStrategy')
        authenticationManager = ref('authenticationManager')
        authenticationFailureHandler = ref('authenticationFailureHandler')
        rememberMeServices = ref('rememberMeServices')
        authenticationDetailsSource = ref('authenticationDetailsSource')
    }
}
- Activate the filter during bootstrap, in grails-app/conf/BootStrap.groovy:
class BootStrap {
    def init = { servletContext ->
        ...
        SpringSecurityUtils.clientRegisterFilter('appUserAuthenticationFilter',
            SecurityFilterPosition.SECURITY_CONTEXT_FILTER.order + 10)
    }
}
That should do it. Enjoy!
Even with Grails' "Convention over Configuration" paradigm, aimed at simplifying the coding, there is still a high learning curve to get into the groove of it, especially if you want to do something different from the default. The layers are thick, and the magic is ever more mysterious. The root cause is the enormous amount of flexibility the framework tries to offer: every little decision is wrapped up and ready to be swapped out.
Reference: Grails Custom AuthenticationProvider by Kali Kallin.
Thursday, January 12, 2017
<? super T> and <? extends T> in Java
Today I came across some Java code from a library we use. As with many framework/library code bases, it contains lots of Generics, Collections, and type wildcards (?). For example:
<? super T> and <? extends T>
or
public static <T extends Comparable<? super T>> void sort(List<T> list)
The syntax is complex, and it prompted me to study the wildcards of generic types further. Thanks to Google and Stack Overflow, the online material is good.
- Difference between <? super T> and <? extends T> in Java
- What is PECS (Producer Extends Consumer Super)?
- Bounded Wildcards Increase Applicability
In my own words, which are less precise than these articles but maybe easier to understand:
- If you only need to read from a data structure, declare it with extends, since that guarantees the element type and you stay type safe when using the objects.
- If you need a data structure that allows putting objects into it, declare it with super, as that allows more type flexibility.
This code example helps show both cases (copied from Generics FAQ):
public class Collections {
    public static <T> void copy(List<? super T> dest, List<? extends T> src) { // bounded wildcard parameterized types
        for (int i = 0; i < src.size(); i++)
            dest.set(i, src.get(i));
    }
}
As you can see, the src list is declared as extends, since you only need to read from it (get), and the dest list is declared as super, as you need to write to it (set).
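For what it's worth, java.util.Collections.copy() has exactly this bounded-wildcard signature, so a small usage sketch (mine, not from the Generics FAQ) shows why the bounds matter: a List<Number> can receive elements copied from a List<Integer>:

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class CopyDemo {
    public static void main(String[] args) {
        List<Integer> src = Arrays.asList(1, 2, 3);                   // producer: only read from (extends)
        List<Number> dest = new ArrayList<>(Arrays.asList(0, 0, 0));  // consumer: written into (super)

        // copy(List<? super T> dest, List<? extends T> src)
        java.util.Collections.copy(dest, src);

        System.out.println(dest);   // prints [1, 2, 3]
    }
}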
Enjoy!
Friday, December 30, 2016
Disconnect a PHP PDO connection to RDBMS
http://php.net/manual/en/pdo.connections.php
Inside, it tells you to set the variable that holds the PDO connection object to null. The catch, however, is that you also need to set all other variables that might hold a reference to the PDO object to null as well, and that can be cumbersome, as expressed in the comments section of that page. Here is an example:
$dbh = new PDO('mysql:host=localhost;dbname=test', $user, $pass);
// use the connection here, note the side effect of
// a new $sth reference to the PDO
$sth = $dbh->query('SELECT * FROM foo');
// and now we're done; close both $sth and $dbh !
$sth = null; // this is not obvious
$dbh = null;
Enjoy!
Monday, December 12, 2016
dos2unix
The text file I got from a colleague is full of ^M and crashes my parser running on Mac and Linux. This is because the unfortunate colleague is still stuck with a Windows machine, and line breaks are encoded differently on these systems.
I downloaded the dos2unix util for Mac and installed it smoothly; however, it does not work for some reason, and the ^M's are still there after running it. Bummer! Short of time, I have to roll my own solution. It is yet another one-liner:
$ cat dos2unix
#!/bin/bash
sed -i 's/^M/\n/g' "$1"
However, the main challenge is to type the ^M character correctly inside the script. It is NOT a literal ^ followed by a literal M, but rather a single (carriage-return) character.
- In vi, you type: ctrl-v then ctrl-m
- In emacs, you type: ctrl-q then ctrl-m
Enjoy!
[Update 2016-12-21]:
Coming back to the problem: upon further inspection of the problematic file, using
$ od -c filename
It turns out the line breaks are represented only as \r, and I guess that is why the regular dos2unix, which expects \r\n, does not work. Knowing this, another fix comes to mind (without having to type the ^M character):
$ sed -i -e 's/\r/\n/g' filename
Also, my docker-infested colleague suggested another general solution for running the standard dos2unix (when it works), without having to download and install it in the host environment:
$ docker run --rm -it -v `pwd`:/data/ alpine dos2unix /data/filename
This assumes the file is in the current directory (`pwd`) and that you have Docker installed on the host. It basically launches a minimal Docker container and uses the dos2unix that comes with it. However, it won't solve my specific file with its non-traditional line breaks.
Enjoy!
Find out the Perl modules installed and their versions
$ whichpm.sh module::name perl_bin_path
It prints out all the Perl modules loaded by the given Perl interpreter when it loads the named module, and then the version of that module.
$ cat whichpm.sh
#!/bin/bash
echo 'print map { sprintf( "%20s : %s\n", $_, $INC{$_} ) } sort keys %INC; print "\n '$1' version : $'$1'::VERSION\n\n"' | $2 "-M$1"
$ ./whichpm.sh LWP::UserAgent /usr/bin/perl
Carp.pm : /System/Library/Perl/5.18/Carp.pm
LWP.pm : /Library/Perl/5.18/LWP.pm
LWP/UserAgent.pm : /Library/Perl/5.18/LWP/UserAgent.pm
Storable.pm : /System/Library/Perl/5.18/darwin-thread-multi-2level/Storable.pm
Time/Local.pm : /System/Library/Perl/5.18/Time/Local.pm
URI.pm : /System/Library/Perl/Extras/5.18/URI.pm
strict.pm : /System/Library/Perl/5.18/strict.pm
vars.pm : /System/Library/Perl/5.18/vars.pm
warnings.pm : /System/Library/Perl/5.18/warnings.pm
LWP::UserAgent version : 6.13
The reason for the 2nd argument (the Perl interpreter) is that there can be multiple Perl interpreters on a system, and they tend to have different versions and different sets of supporting modules.
Of course, to make it a reusable script, you will have to add a help message, input-argument checking, etc.
Enjoy!