The tool lets you connect to different indexes and assists you in crafting search queries.
As a byproduct it generates a REST URL for you to use in your application or website.
With this it should be easier to develop against the Azure Search REST API – and it should speed up testing enormously. When I started fiddling with the Azure Search API I was not aware which query options I had and how they play together (even though the documentation is quite good!).
In this very early release the tool covers the following areas:
Security / Access
Provide your Azure Search service and your API key (currently only the admin or the secondary key works). After connecting, the tool will resolve all available indexes.
From then on, the tool takes care of the api-key header for you.
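A minimal sketch of what happens under the hood – the service name and key below are made up, and the api-version is an assumption (use whichever version your service targets):

```powershell
# Hypothetical service name and key – replace with your own values.
$service = "myservice"
$apiKey  = "YOUR-ADMIN-OR-SECONDARY-API-KEY"

# The api-key travels as an HTTP header on every request.
$headers = @{ "api-key" = $apiKey }

# Resolve all available indexes, just like the tool does after connecting.
$url = "https://$service.search.windows.net/indexes?api-version=2015-02-28"
Invoke-RestMethod -Uri $url -Headers $headers -Method Get
```

This needs a live Azure Search service to run, of course – the point is simply that the key goes into a header, not into the URL.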
As of now, the tool will show you all available indexes. The schema/fields and types will be visible soon in the “Index” tab. If you want to change the index after you have selected one, click “Connection” in the menu and then “Index”.
In the flyout menu you can select one of your available indexes.
In the Search section of the tool you can craft queries and test them against the selected index. The tool helps you create a query that uses the important options through an easy interface (try that with Postman – not as convenient, right?).
So far $top, $skip, $filter and api-version are implemented – the rest of the options will follow soon. Maybe I will add validation and some examples, too.
Once you have changed the parameters, the URL at the top is updated so that you can use it in your application.
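To give you an idea, a generated query URL might look roughly like this – service name, index and filter field are made up, and the api-version is an assumption. Note the single quotes, so PowerShell does not expand `$top` and friends as variables:

```powershell
# Hypothetical example of a URL the tool generates – adjust service/index/fields.
$url = 'https://myservice.search.windows.net/indexes/products/docs' +
       '?api-version=2015-02-28' +
       '&search=bike' +
       '&$top=10&$skip=20' +           # third page with a page size of 10
       '&$filter=price%20lt%20100'     # OData filter, URL-encoded

# Same api-key header as before, against a live service.
Invoke-RestMethod -Uri $url -Headers @{ "api-key" = "YOUR-API-KEY" }
```

Copy the URL into your application and you are done – only the header handling stays on your side.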
You need the raw JSON data that is returned by Azure Search? You want it pretty? Click on “Raw” to get the results exactly as they are returned by Azure Search.
You want it raw? You get it RAW!
As you can see, not all options are implemented – I released the tool early to get early feedback. In the near future I will implement the Search view and all the nice options you have there. Then I will add Suggestions and Facets – because they rock.
Lastly, I will develop a nice view where you can upload test data according to the schema of the selected index. Maybe – though right now I doubt it – I will add support for index operations to create and update an index; but the Azure Portal does that pretty well and the code for those operations is not too complex – well, let's see.
Scoring profiles? Index statistics? They are on the list, too.
So you read all the way down here? If so, I would be happy to get some feedback from you. Does something not work? Is something missing?
In this article I will show you how to create SharePoint 2010/2013 Search content sources with a handy PowerShell script – and why you should care.
This is my 100th blog post, and therefore it has to be something about SharePoint Search – something good. I have had the idea for this script in mind for quite a while, but there was no project and no time to create it – until now.
During my consulting work I see quite a lot of different SharePoint environments, and 90% of the Search Service Applications look like this before I start working there:
What is the problem, you may ask? I see quite a few. Most intranets I see are rather big and have different sections/departments/regions within their portal – with different requirements, of course. Some aggregate their content via search (aka search-driven applications) and some don't; some upload documents, some only use SharePoint as an archive. As flexibly as you can use SharePoint, and as different as the requirements can be, you should adjust your search content sources accordingly – because everything you configure there will be in the search index, and everything else won't.
Everything you configure in the SharePoint Content Sources will be in the search index – everything else won’t!
So what is the problem with one content source?
You are not flexible – no different crawl schedules, no priorities – just one setup to cover everything. How about different settings for DEV/QA systems? How about different crawl schedules for People Search? LOB/BCS, external systems? Or people complain that some results appear too late in the search – or not at all? Read on!
You can create content sources with PowerShell – that’s a good thing and enables us to automate it. So the easiest way to create a Content Source would be (except creating them by hand in the central admin) with this one-liner:
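The one-liner could look something like this – a sketch, assuming the SharePoint snap-in is loaded and using a made-up content source name and start address:

```powershell
# Simplest possible content source – prompts for the Search Service Application name.
# Name and start address below are examples; replace them with your own.
New-SPEnterpriseSearchCrawlContentSource `
    -SearchApplication (Read-Host "Search Service Application") `
    -Name "Intranet" -Type SharePoint `
    -StartAddresses "http://intranet.contoso.com"
```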
You will be asked for the name of the Search Service Application, and then it creates the content source for you – good, simple, and it works. But it sets no crawl schedule. And imagine creating that for 16 different content sources with different crawl schedules and what not. It should be maintainable and readable, too. So we need a more sophisticated script with reproducible results – in other words, an XML config file and a PowerShell script with the logic.
The config file contains every content source to be created or updated.
In the third line you have to specify the Search Service Application – in my case this is “Search Service Application”.
In lines 13, 46 and 81 I configure three different content sources – if you want to create more, you only have to copy one block and, for ease of use, the comments surrounding the block starting with <ContentSource> and ending with </ContentSource>.
Every <ContentSource> block has a unique name, a type (currently tested: SharePoint and File) and a crawl behavior (CrawlSites = crawl the site collection; CrawlVirtualServers = crawl the entire host/web application).
In every <ContentSource> you can define the start addresses of the content source within the <Url> element. If you have multiple addresses, separate them with a new line.
Every <ContentSource> can have up to two crawl schedules. The comments above and the three provided examples should give you a good understanding of how to configure the crawl schedules.
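To make the structure concrete, one block could look roughly like this – a sketch with made-up names and URLs, and the element names follow the description above rather than the exact schema of the shipped config file:

```xml
<!-- Hypothetical example block - copy one per content source -->
<ContentSource>
  <Name>Intranet</Name>
  <Type>SharePoint</Type>
  <CrawlBehavior>CrawlVirtualServers</CrawlBehavior>
  <Url>
    http://intranet.contoso.com
    http://mysites.contoso.com
  </Url>
  <!-- up to two crawl schedules go here, e.g. one full and one incremental -->
</ContentSource>
```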
You don’t have to read the script or customize it – everything is configured in the XML config; the script just reads the XML file, which must be in the same directory. The script has no parameters but must be run in an elevated PowerShell on a SharePoint server.
Run the script in an admin PowerShell
Running the script with the provided xml would give you the following output:
Search Service Application:Search Service Application -exists
This will result in the following configuration in the central admin:
3 new content sources
one example with the urls set
… and the crawl schedules!
Important: This script does not remove or rename content sources (it simply can’t detect those changes)! If you want to rename an existing content source, you can either delete the content source in the central admin or rename it both there and in the XML file.
Important: Please keep in mind that deleting a content source or changing start addresses within a content source deletes items from your index! The index is cleaned up automatically once you remove URLs from a content source, and you will have to recrawl the items if you still need them!
If you change the parameters in the config file, e.g. the crawl schedule, and run the script again, the script will update the content source for you. So if you have a different environment with the same configuration, just copy both files; if the environments differ, adjust the config file accordingly.
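Under the hood such an update boils down to the stock cmdlets – roughly like this sketch (content source name and schedule are made up; it must run in an elevated PowerShell on a SharePoint server):

```powershell
# Look up the Search Service Application and an existing content source by name.
$ssa = Get-SPEnterpriseSearchServiceApplication "Search Service Application"
$cs  = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Intranet"

# Example update: set a daily incremental crawl starting at 03:00.
Set-SPEnterpriseSearchCrawlContentSource -Identity $cs -SearchApplication $ssa `
    -ScheduleType Incremental -DailyCrawlSchedule `
    -CrawlScheduleStartDateTime "03:00"
```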
How to get the script
I released the script based on the MIT license in this GitHub repository:
It truly was an interesting year – I totally enjoyed it, professionally and personally.
Stackexchange / #SPHELP
I actively started to share my experiences and knowledge around SharePoint, especially about SharePoint Search – I watched the tag #sphelp on Twitter and started to participate on sharepoint.stackexchange.com – I even got some badges and over 1,000 points:
Blogging is a lot of work for me – English is not my first language, but I have reached a stage where I can now enjoy writing and sharing my thoughts, even when I know it's not perfect. Maybe in 2014 I will find some time to improve my writing skills; until then, please excuse my errors. What I actually wanted to say: I invested a lot of time in this blog, answered each and every comment (roughly 100), wrote 44 blog posts and received roughly 50,000 views this year – wow, this is fun! Compared to last year this is 10x more – thanks for reading and following my stuff, it motivates me a lot! Here is a list of my most viewed posts in 2013:
Numbers 1, 2 and 5 are my personal favorites – it was great discovering how to build these solutions, and I even used them for several client projects. But the most fun I had was writing this one: SharePoint 2013 Search Preview for Documents hosted in SharePoint 2010. The help and feedback I received from the SharePoint community for this one was amazing! I even got contacted by Microsoft asking why and how I built it…
I am one of those people who think that certifications really help – so I try to do plenty of them (it’s a good motivation for my team, too). This year I only did three certifications (70-417, 70-488, 70-489); one of my resolutions for next year is to finish all SharePoint 2013 certs.
I contributed to several open-source SharePoint solutions – and I even started my own:
Giving back solutions and knowledge, and receiving questions about the solutions I share, is really worth some nights and weekends – sounds crazy, but try it for yourself! I would not be where I am right now without the professional solutions and shared knowledge floating around in the SharePoint community.
I spent countless hours creating webcasts – rerecording them, deleting them – in both English and German (the German SharePoint community needs some love, too). I want to do more of them, even recordings of paid tools (AvePoint, SPCAF, SPDockit, …) – it takes time, it’s good practice for me – and it’s FUN! More to come in 2014.
Conferences / Speaking
Conferences are great – meeting Twitter friends in person is great; I even had the honor to chat with THE one Todd Klindt at TechEd in Madrid (that is always super fun!!!) – good stuff. I want more of it – hopefully I can convince my boss to send me to SPC14… the new year will tell! Additionally, I went to two or three SharePoint user groups in Munich – next year I will speak at at least one in February/March. Two “conference” highlights in 2013 were the two ShareCamps: one in Munich and the second (and the first one ever) in Vienna, Austria. In Munich I gave two presentations, one about Search and the other about SharePoint tools – see the recap here. I will even try to speak at some international conferences – the biggest one will be the free SP24 conference – I could use some votes here: http://list.ly/i/419694
It was a great SharePoint year – I had very exciting new projects: a migration and two SharePoint-from-scratch projects, one of them very challenging. SharePoint 2013 is gaining more and more traction in Germany – in 2014 there will be more of it. Hopefully I will get rid of all SharePoint 2007 projects – this old stuff is not good for me and not good for my clients (I am not saying it’s always the tool!).
I have more search trainings booked for 2014 than I had at this time in 2013 – and with the enhanced SharePoint 2013 search there will be many challenges! And I hope that SP1 for SharePoint 2013 brings more of it – maybe a push API for search indexing?
Hosting on Azure for Dev/Test will accelerate – even in Germany.
Hosting on Azure is pricey, but it’s worth it. You only pay for what you use – perfect for dev/test in the cloud.
Apps Apps Apps
I have not seen that many SharePoint apps yet – but I will write my own, and hopefully my clients will jump on the apps train!
Happy New Year!
Phew – more than I thought, what a year! But now it’s time to say thanks for reading this far – next year will be exciting, too!
I wish you all a Happy New Year – may it be even better than the last and with lots of fun! See you around!