If you use Pleasant Password Server, you may have a need to request passwords from a command line or automated process. If you do, the script below should be very helpful.
It took me most of the evening to figure out how to request passwords using PowerShell and the RESTful API built into Pleasant Password Server (aka Keepass Server).
The vendor’s documentation is unfortunately very lacking. Seriously, would it kill you to include some examples? At any rate, the script below uses the Invoke-WebRequest cmdlet to access the RESTful API.
The key thing to note here is that the only way to retrieve a password seems to be via its GUID. Importantly, this is not the UUID that is displayed in the desktop client.
The only way I’ve found to identify the GUID is to access the desired password using the webclient and then press F12 in your browser to activate the debugging tools.
From there if you select the “Network” tab, you should see the GUID appended to the end of the URL for your password server site.
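With the GUID in hand, a request along these lines should work. This is a hedged sketch only: the hostname, port, endpoint path, and authentication scheme below are assumptions and vary between Pleasant Password Server versions, so adjust them to match your install.

```powershell
# Hedged sketch: server name, port, and API path are placeholders for your environment.
$server = 'https://passwordserver:10001'
$guid   = '00000000-0000-0000-0000-000000000000'   # taken from the browser's Network tab
$cred   = Get-Credential                            # an account with access to the entry

# Request the credential entry by GUID and parse the JSON response
$entry = Invoke-WebRequest -Uri "$server/api/v4/rest/credential/$guid" `
                           -Credential $cred -UseBasicParsing |
         Select-Object -ExpandProperty Content | ConvertFrom-Json
$entry.Password
```

If your server uses token-based authentication instead of basic credentials, you would first request a token and pass it in a header; check your server's version for the exact scheme.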
I needed to run a PowerShell script on a few dozen machines scattered across just as many disconnected networks, and I wanted to ensure that if anyone attempted to modify the script in the future, it would no longer execute. That means learning how to implement PowerShell script-signing certificates. After much Googling, I found that there was no good end-to-end guide on the subject. After much trial and error, I have figured out how to implement PowerShell certificates in such a way that you do NOT need to purchase a commercial certificate while still being able to run the script on remote systems. I figured I would share the process in the hopes of saving the next person the frustration I went through.
Disclaimer: These steps are presented without any warranty, express or implied. As far as I have been able to determine, this process should drastically improve the security of your scripts without introducing any new security issues. However, as I am still learning about certificates, I may have missed something. If you do find a security concern, please let me know, as I’d love to know what I missed!
Note: The commands below use the “pki” module from PowerShell 4 and therefore require Windows 8.1 / Windows Server 2012 R2
If a modern OS is not available, these same steps can be completed through a combination of legacy tools (makecert.exe and certmgr.msc)
Specific steps on completing this with a legacy OS are not covered in this document
How the Certificate Creation Script Works
Creates a custom self-signed certificate on the local machine where the script authoring takes place
The entire key (public+private) is exported for archival and safekeeping
The public key of this certificate is then exported and immediately reimported into both the Root and Trusted Publisher certificate stores on the authoring computer/user
This makes the certificate implicitly trusted on the authoring computer, which makes it eligible to be used to sign a PowerShell script
The newly created certificate is then used to sign a custom PowerShell script
The public certificate is then imported onto the target/remote system where the script is intended to be executed
The target system is assumed to be running an ExecutionPolicy of “AllSigned”, which requires that all scripts be signed by an approved entity before they are executed
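The steps above can be sketched roughly as follows. Treat this as an outline, not the finished script: the file paths are placeholders, and `-Type CodeSigningCert` requires a newer pki module than PowerShell 4’s (on Windows 8.1 itself you would fall back to the legacy makecert.exe mentioned earlier).

```powershell
# Create a self-signed code-signing certificate on the authoring machine
# (paths below are placeholders; -Type CodeSigningCert needs a newer pki module)
$cert = New-SelfSignedCertificate -Subject 'CN=ScriptSigning' -Type CodeSigningCert `
                                  -CertStoreLocation Cert:\CurrentUser\My

# Archive the entire key pair (public + private) for safekeeping
$pfxPwd = Read-Host -AsSecureString -Prompt 'PFX password'
Export-PfxCertificate -Cert $cert -FilePath C:\Certs\ScriptSigning.pfx -Password $pfxPwd

# Export the public key, then trust it in Root and TrustedPublisher
Export-Certificate -Cert $cert -FilePath C:\Certs\ScriptSigning.cer
Import-Certificate -FilePath C:\Certs\ScriptSigning.cer -CertStoreLocation Cert:\CurrentUser\Root
Import-Certificate -FilePath C:\Certs\ScriptSigning.cer -CertStoreLocation Cert:\CurrentUser\TrustedPublisher

# Sign the script; on each target system, import the .cer the same way
Set-AuthenticodeSignature -FilePath C:\Scripts\MyScript.ps1 -Certificate $cert
```

Note that importing a certificate into the Root store will trigger a confirmation prompt, which is a good thing: you are explicitly choosing to trust your own certificate authority.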
But what happens if you hit the same error under VMware Workstation (11)? Today I had a reason to detach a secondary disk from one VM and temporarily connect it to another. When I was done, I reattached the secondary disk to its original VM. When I tried to boot it, however, I received the following error:
In the HOWTO posted above, this can be resolved by updating the CID and ParentCID fields inside of the plain text .VMDK configuration file for the VM. Unfortunately that doesn’t apply to VMware workstation because this “descriptor” / configuration data is stored in the same file as the “-flat” disk making it impossible to edit with a plaintext editor such as notepad. Instead we need to use a Hex Editor. Don’t worry, this is much simpler than it sounds.
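For reference, once you locate the embedded descriptor in the hex editor, it looks something like the fragment below. The exact values here are illustrative only; the important fields are CID and parentCID, which must match what the parent/child disks expect.

```
# Disk DescriptorFile
version=1
CID=fffffffe
parentCID=ffffffff
createType="monolithicFlat"
```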
If you’ve used PowerShell for any length of time at all, I promise this HOWTO is going to be a revelation and will fundamentally change how you use PowerShell.
PowerShell is full of objects. All of those objects have properties. Many of those properties have their own sub properties. Those child properties can then still have even more properties underneath them. While it is fantastic that we have all of this data at our fingertips, it is often exceptionally difficult to know what’s available. It’s a case of not knowing what you’re missing because you didn’t know about it in the first place.
To combat this, PowerShell includes an excellent command called Get-Member which shows what properties are available on an object. The problem is, it doesn’t show sub-properties, nor does it show the values of those properties. This combination makes searching for available data both frustrating and annoying. I’m pleased to report I now have a solution for this problem!
I recently found myself having to learn about “JSON” for work. In a nutshell, JSON is an alternative to XML and is a text based representation of data. To work with JSON, PowerShell includes a cmdlet called ConvertTo-JSON. By complete accident I discovered that this cmdlet has a very interesting capability. If you pipe any object into it, it will spit out absolutely everything PowerShell knows about that object, nested sub properties and all.
I then did some research and discovered a free standalone tool (no installation required) called jsonview.exe from CodePlex. This tool provides a graphical tree view of JSON data. Can you see where this is going? Wouldn’t it be amazing if you had a nice graphical interface to view all of the data inside of an object, regardless of how far down it was nested?
Consider the following example. We have a cmdlet called Send-Email that isn’t working properly. When we try to use it, all we get is an error “Unable to connect to the remote server”.
The question is: what server? And why can’t it connect?
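Here is the technique in action. Send-Email stands in for whatever failing cmdlet you’re fighting with, and the jsonview.exe path is a placeholder for wherever you unzipped the tool:

```powershell
# Trigger the failure and capture the error record
# (Send-Email and the jsonview path are placeholders for your environment)
try { Send-Email -To 'user@example.com' } catch { }

# Dump everything PowerShell knows about the error, nested sub-properties and all
$Error[0] | ConvertTo-Json -Depth 4 | Out-File "$env:TEMP\error.json"

# Open the result in the graphical tree viewer to browse it
& 'C:\Tools\jsonview.exe' "$env:TEMP\error.json"
```

Buried somewhere in that tree you will typically find the underlying exception, complete with the server name and port it was trying to reach.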
I am a big fan of the traditional Blackberry philosophy of building hardware and software that can get work done quickly in as few steps as possible.
With the release of OS10, Blackberry effectively had to build a new OS from scratch. This is a ton of work and as a result, many of the refinements that were added to OS7 over the years are now missing. I have read many forum posts of people requesting feature x to be added back to the platform. Blackberry has finite resources for development and testing and simply can’t implement everything all at once. As a result, these posts become effectively nothing but noise. It was clear to me that forums alone are an ineffective method for getting Blackberry the information it needs for what features to implement next.
What is required is a “single source of truth” or a curated list of all requested new features and the relative popularity of each. I have decided to take ownership of this problem and attempt to solve this at a global scale by providing a common platform for everyone to voice their requests.
I have created a new voting poll that I intend to be a living list of new feature requests for the Blackberry 10 operating system. I am formally offering myself as the official curator of these requests. You can see the first iteration of the poll below that includes a number of feature requests that I personally have after using my Classic for several days. My hope is that others will review my list and if they agree can vote on those items. Alternatively they can add their own. I will be monitoring the new submissions and if they don’t already exist or are not entirely unreasonable I will add them to the poll.
Below are the poll and its questions. If possible, please leave feedback either here or on the official poll website (available at http://poll.fm/536y9), as that feedback is required to correlate the numbers in the polls, show the relative interest in a given feature, and give this poll the weight it needs to have any hope of effecting change.
If you have any questions, comments or suggestions, please leave them in the comments as well. I want to make this a tool that the entire Blackberry community can leverage and maybe, just maybe influence the behavior of Blackberry itself!
If you would like to see new feature requests added to this poll or changes to the existing entries, please add the entry to the “other” field in the poll and then leave comments on how you feel your idea should be implemented.
If you’ve been in IT for any length of time, you’ve run into situations where you’ve been asked to delete user-created folders whose paths are longer than 260 characters. If you try to delete such a folder, you get something like:
So you start Googling and invariably you’ll find the same tool recommended over and over again – the Long Path Tool from www.longpathtool.com. I have to give the guy that wrote it credit. He has completely saturated the search engines with his solution for a problem that by all accounts shouldn’t even exist. At any rate, you grab his tool and try to run it only to be greeted with:
I don’t know about you, but I all but refuse to buy a tool for something that should be a solved problem in 2014. I knew that robocopy is a built-in tool on every Windows installation, and I knew that it supports paths up to 32,000 characters long. I thought I could use that. Unfortunately, robocopy doesn’t offer any native support for deleting data, only copying it, as per its namesake. But with robocopy being the only native tool I was aware of that could address deleting folders with long paths, I decided to spend some time with it.
If you think outside the box a bit, you quickly realize that robocopy can delete folders if you approach the problem from a different angle. Robocopy has a “mirroring” function that will mirror a source folder to a destination folder. If the source folder is empty, the contents of the destination folder are deleted. Ah ha!
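The mirror trick can be sketched in a few lines. The target path below is illustrative; point it at the folder you actually need to remove:

```powershell
# Hedged sketch of the robocopy mirror trick; $target is a placeholder path
$target = 'C:\Data\VeryLongPathFolder'
$empty  = New-Item -ItemType Directory -Path "$env:TEMP\EmptyDir" -Force

# Mirroring an empty folder onto the target deletes everything inside it,
# including items whose paths exceed 260 characters
robocopy $empty.FullName $target /MIR /R:1 /W:1 | Out-Null

# The target itself is now empty and can be removed normally
Remove-Item $target, $empty.FullName -Force
```

The /R and /W switches just keep robocopy from retrying for ages if it hits a locked file.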
I decided to wrap this idea in an easy-to-use PowerShell cmdlet. I ended up falling way, way down the rabbit hole on this one as I strove to create the most comprehensive and complete cmdlet I’ve ever built. The objective was to make it robust enough that it could be mistaken for an “official” cmdlet. I’m pleased to report that I have been largely successful. If you have a need to delete a bunch of folders that contain long paths, read on to find out how to do so for free!
If you found this post via a search engine, you’ve likely received a ticket/request from some manager requesting an audit report of the permissions on an important share within your company. Unfortunately for you, this folder contains literally tens of thousands of folders and hundreds of thousands of files. Oh and since there has been no proper governance of it over the years, inheritance is broken all over the place and permissions are assigned many levels deep with no rhyme or reason. You’ve now been tasked with cleaning this up. You realize that trying to analyze this manually is simply impossible so you’re looking for some kind of tool to assist you. You’ve found tools like the NTFS Permissions Reporter (http://www.cjwdev.com/Software/NtfsReports/Info.html) but quickly found this costs hundreds of dollars in order to produce any kind of intelligible report. You’re not allowed to spend any money so you’re stumped. So now what?
I found myself recently in this exact situation and decided to use this as my first real attempt at building a full-fledged tool with PowerShell. Wait! Don’t run away yet. There is nothing to be afraid of here, as I’ve designed this tool to be useful even if you have absolutely no PowerShell experience. Again, you don’t care how you get the report, you just care that it’s readable. That’s what I’m here to help you with. I call the tool ntfsreporter, and it works as follows:
Accepts a parent folder (can be a local folder or a UNC path on a remote machine)
Builds a list of all files and folders including all subfolders and files along with the permissions assigned to each
Here’s where it gets interesting:
Compares the permissions on each item to those of its parent. If the permissions match, the item is ignored. If they don’t match, someone has been assigned unexpected rights, so the item is included in the report
Has the option to easily specify a list of accounts to automatically ignore in the report. So if you have Domain Admins or some special account that has access everywhere anyway, you can easily exclude it
Has the option to include SIDs if desired for user accounts that no longer exist but still have permissions allocated (disabled by default)
Clearly identifies what permissions have been added or removed on a per file and folder basis
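To give a feel for the core idea, here is a deliberately simplified sketch. The real cmdlet compares each item’s ACL to its parent’s; this approximation just flags non-inherited (explicitly assigned) entries, which catches much of the same noise. The share path and exclusion list are placeholders:

```powershell
# Simplified, hedged sketch of the core comparison; paths and accounts are placeholders
$ignore = @('BUILTIN\Administrators', 'MYDOMAIN\Domain Admins')

Get-ChildItem -Path '\\server\share' -Recurse | ForEach-Object {
    # Keep only access entries that were assigned directly, not inherited
    $explicit = (Get-Acl -Path $_.FullName).Access |
        Where-Object { -not $_.IsInherited -and
                       $ignore -notcontains "$($_.IdentityReference)" }
    if ($explicit) {
        [pscustomobject]@{
            Path   = $_.FullName
            Who    = ($explicit.IdentityReference -join '; ')
            Rights = ($explicit.FileSystemRights  -join '; ')
        }
    }
}
```

Piping that output to Export-Csv gives you something a manager can actually open in Excel.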
Does this sound like it might be helpful for you? Excellent, let’s get started.
You guys are in for a treat today. Have you ever attempted to troubleshoot an issue with an Active Directory user only to find yourself in ADSI Edit trying to figure out if any attributes are not configured correctly? Since you’re often not sure what a value is supposed to be, you’d like to compare it to another known working user. That’s a pretty common problem in Windows IT, and I’ve just created a tool that makes it much, much easier to solve! Let me now show you how it works.
When you run this tool, a dialog box appears that looks like this:
This screen is asking for the users you’d like to display Active Directory attributes for. You can enter 1 or 2 or 10 different users if you’d like, just be sure to separate each one with a semicolon.
It can take a few moments for the AD attributes to load so a progress bar is displayed. It shouldn’t take more than 5 seconds or so per user.
You’re now presented with a grid view that contains 3 columns: the username, the attribute name, and its value. That’s pretty cool, but it’s not the best part. The grid view has a built-in real-time search that automatically matches against every single item of text, which means you can do some incredibly powerful analysis.
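The heart of the tool can be sketched like this (the usernames ‘alice’ and ‘bob’ are placeholders, and the ActiveDirectory module from RSAT is assumed to be installed):

```powershell
# Hedged sketch of the tool's core; 'alice' and 'bob' are placeholder usernames
Import-Module ActiveDirectory

$rows = foreach ($name in 'alice;bob' -split ';') {
    # Pull every attribute AD has for this user
    $user = Get-ADUser -Identity $name.Trim() -Properties *
    foreach ($prop in $user.PSObject.Properties) {
        [pscustomobject]@{ User = $name; Attribute = $prop.Name; Value = "$($prop.Value)" }
    }
}

# Out-GridView's filter box searches every cell in real time
$rows | Out-GridView -Title 'AD Attribute Comparison'
```

Typing an attribute name like `proxyAddresses` into the filter box instantly shows that attribute side by side for every user you entered.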
In PowerShell, you will constantly find yourself iterating through collections of data. Perhaps you have a list of users or folders to process. Often you will obtain this data through some other cmdlet, perhaps Get-ADUser or Get-ChildItem. This works well, but what happens if you’re testing and only want to iterate through a subset of this data? Or perhaps you have a CSV or log file with a bunch of text in it that you need to iterate through? There are many ways to go about this. The most common is probably saving the data into a file and then using the Import-Csv cmdlet. That’s fine, but it annoys me: if I’m just testing, I don’t want to create more files I’m likely to forget about anyway. Another option is to declare an array, but then you have to wrap each line in quotes. If you are copying and pasting 100 rows from Notepad, that’s a lot of extra work. That’s no good. To solve this once and for all, I have spent a bunch of time coming up with the quickest and easiest possible way of taking a block of text and converting it into an array. I’d like to share it with you.
The core feature we are going to take advantage of here is something called a here-string. A here-string (especially the single-quoted @'...'@ form) can be thought of as a block of text where PowerShell doesn’t attempt to interpret any special characters it may find. In our case, we’re going to take advantage of the fact that here-strings do not strip out the newlines (aka carriage returns).
Let’s say that we have a list of processes in Notepad that we want to get details on. That list looks like the screenshot below. Note that there is absolutely no special formatting and there are no additional characters, just as you’d typically find this kind of data "in the wild".
What’s the fastest possible way to get the details of these processes?
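Here is the pattern. Paste the raw lines from Notepad straight between the here-string markers; no quotes or commas required. The process names below are just sample data:

```powershell
# Paste the raw lines between the here-string markers exactly as copied
$raw = @'
explorer
winlogon
lsass
'@

# Split on newlines to get an array, dropping any blank lines
$names = $raw -split "`r?`n" | Where-Object { $_.Trim() }

# Now iterate as usual; missing processes are silently skipped
$names | ForEach-Object { Get-Process -Name $_ -ErrorAction SilentlyContinue }
```

The single-quoted @'...'@ form matters here: a double-quoted here-string would try to expand any `$` it found in your pasted data.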
I find myself constantly in a situation where I have a giant list of data and I need to know which entries in that data occur most frequently. I’ve always used Excel pivot tables for this purpose. Recently someone saw me solve a problem using this technique; they were very surprised and wished they’d known it years ago. With that in mind, I figured I’d share my technique.
Let’s create a made-up scenario for demonstration purposes. Let’s say you want to know which DLLs on your computer are loaded in memory most frequently. The scenario doesn’t matter; it’s just an excuse to generate some data.
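One hedged way to generate that sample data is to list every module loaded by every process and dump it to CSV (processes you can’t access are simply skipped):

```powershell
# Collect every loaded module from every accessible process into a CSV
# (access-denied processes are skipped via the empty catch block)
Get-Process |
    ForEach-Object { try { $_.Modules } catch { } } |
    Select-Object ModuleName, FileName |
    Export-Csv -Path "$env:TEMP\loaded-dlls.csv" -NoTypeInformation
```

Open the resulting CSV in Excel and you have the raw data for the pivot table steps below.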
Ultimately, you’re going to end up with Excel open with a bunch of columns of data similar to this:
As you can see, we have 2,092 rows of data with the fourth column being FileName. We want to produce a list in order of frequency of which DLL is listed most often. To do that complete the following steps:
Select either the column you’re interested in or the entire sheet depending on your needs
Select Insert / Pivot Table
The range will default to what you selected. Since you selected all the data, simply press OK