Blogs
Overview
You are browsing a SharePoint site and you receive the following error message.
Diagnosis
The web apps do not open in IE/Chrome/Firefox from outside the SharePoint servers. We tried to open the web app from each WFE server: it opens fine from WFE1 but not from WFE2. The following error was found in the ULS log on the failing WFE.
UnauthorizedAccessException for the request. 403 Forbidden will be returned. Error=Exception of type ‘System.Web.HttpUnhandledException’ was thrown.
at System.Web.UI.Page.HandleError(Exception e)
at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)
at System.Web.UI.Page.ProcessRequest(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)
at System.Web.UI.Page.ProcessRequest()
at System.Web.UI.Page.ProcessRequest(HttpContext context)
at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)
Cause
The "Everyone" group's Read permission was removed from the bin directory of the web app that is not opening.
Solution
Go to the affected web app's bin directory, open its Properties, add "Everyone", and grant Read access. Run IISRESET. Everything will then open fine.
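If you prefer to script the fix rather than use the folder's Properties dialog, a minimal sketch using icacls follows; the bin path shown is a placeholder assumption, so substitute the failing web application's actual virtual directory path.
# Grant the local Everyone group read/execute on the web app's bin folder.
# The path below is a placeholder - point it at the failing web app's bin directory.
$binPath = "C:\inetpub\wwwroot\wss\VirtualDirectories\80\bin"
icacls $binPath /grant "Everyone:(OI)(CI)RX"
# Restart IIS so that the change takes effect
iisreset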
Netwoven Blog | Jul 27, 2015 12:25pm
This week, I’m at the SHRM Conference in Las Vegas with 15,000 of my HR brethren, and there is definite excitement in the air. Marcus Buckingham started off Monday’s general session by talking about strengths, leadership, and performance. One of the specific areas he focused on was the importance of a good team leader for the performance of the organization. Many of the processes companies put in place are focused on improving the organization or supporting the employee populace, but there is a dearth of support for the team leaders. Even a traditional engagement survey doesn’t necessarily serve the needs of...
SHRM Blog | Jul 27, 2015 12:25pm
Overview
PerformancePoint Services in SharePoint 2013 is a service that provides flexible tools to help us create rich dashboards, scorecards, and key performance indicators in web browsers. PerformancePoint Services helps management users analyse data for better business decisions.
Steps to Create a PerformancePoint Dashboard
First, we have to create a Business intelligence site as shown below:
Once the site gets created, go to "Site Contents"
For designing a dashboard, click on the "Dashboards" library.
Go to ribbon "PERFORMANCE POINT" and click on "Dashboard Designer" as shown below
Launch the Dashboard Designer as shown below
So the first step is to "create a Data Connection". Right click "Data Connection" and click on "New Data Source"
The various types of data connections available are listed above. Choose "Multidimensional" and "Analysis Services", then click the OK button as shown in the figure below:
Create a connection string by specifying the Server Name, Database Name and Cube name. Then click on Test Data Source. Rename the data connection to "PPSSSASCon" and save it. Right click "PerformancePoint Contents", choose New, and select Report as shown below:
Select Analytic Chart and Click on Ok
Choose Data Source you would like to use. Choose "PPSSSASCon".
Click finish button. Rename the report as "PPS Chart"
From the Details pane on the right side, drag and drop measures and dimensions depending on the requirement.
Right click "PerformancePoint Contents", New and select Dashboard as shown below:
Click on Dashboard to Create a New Dashboard Page Template as shown below:
Select One Zone Report and Click on Ok
Rename it as PPS Dashboard
Drag and drop the "PPS Chart" from the PerformancePoint Content details pane on the right as shown below:
Save the dashboard and deploy it to the SharePoint site: right click the "PPS Dashboard" and click on "Deploy to SharePoint"
Create a filter to filter the dashboard
Select the "Member Selection" template and click Ok
Select Analysis Service Connection "PPSSSASCon" and Next
Click the "Select dimension" button on the right side, choose "Calendar Year Desc" as the filter, and click Ok.
Then select all filter members.
Once these two steps are done, select the filter measure "Sales Amount".
Click Next, choose "Multi Select Tree", click Finish, and rename the filter to "PPSFilters".
Go back to the "PPS Dashboard" and drag and drop the PPS Filters onto the dashboard section
Now the following PPS Dashboard is ready to use after the deployment
Netwoven Blog | Jul 27, 2015 12:24pm
I’m John Cosgrove, an evaluator who is committed to utilization-focused evaluation. I am currently working with community colleges around the country to improve evaluation efforts and the use of data for continuous improvement. Clients indicate they want evaluation and data to drive continuous improvement and decision-making. Although a good place to start, data collection alone won’t get the job done. In her excellent article, Data Don’t Drive, Alicia Dowd reminds us that data alone won’t lead to continuous improvement.
I remember sitting in a faculty session at the University of Michigan Assessment Institute and listening to Richard Alfred discuss the Craft of Inquiry. It was the end of the day and with all apologies to Dr. Alfred, I must admit I was thinking more about crafting dinner plans than inquiry, but then he made a very simple, yet powerful statement: "You don’t make the pig fatter by simply weighing it every day".
Assessment, evaluation, data collection—whatever you want to call it—must be more than keeping score. If we don’t learn something and then take action from what we learn, we are simply recording data for the sake of recording data. As colleges are further inundated with the call for evaluation data from stakeholders, including legislators and funding agencies, they would do well to remember to structure such efforts with a meaningful culture of inquiry.
People engaged in the development of public questions and the thoughtful interpretation of data will drive continuous improvement. We should expand evaluation efforts to determine not only what works, but why it works. We offer the following framework to help link questions, data collection, interpretation and action.
INQUIRE—What Do We Want To Know? Define the specific evaluation questions.
DISCOVER—What Do We Know? Identify data sources and methods of data collection.
INTERPRET—What Does the Data Tell Us? Work with stakeholders to analyze and interpret results/data.
DEVELOP—What Actions Need To Occur? Use results to develop strategies for continuous improvement and further evaluation.
Rad Resources:
Data Don’t Drive: Building A Practitioner-Driven Culture of Inquiry To Address Community College Performance. A Lumina Foundation for Education Research Report, 2005.
Specify The Key Evaluation Questions. Better Evaluation: Sharing Information To Improve Evaluation, 2014
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
AEA365 Blog | Jul 27, 2015 12:24pm
Background
You are using Office 365 for your SharePoint and have created several sub sites in the process. Now you would like to delete these sub sites, and you would like the deletion to be automated.
The code snippet below will delete all sub sites in a SharePoint Online site collection. It is based on the SharePoint client object model (CSOM). For on-premises SharePoint you can use PowerShell, but for Office 365 the code below is a better approach.
Details
Open Visual Studio and start a new console application named "CA_deleteSubSites". Paste in the code below and change the variables where required.
using System;
using System.Text;
using System.Threading.Tasks;
using System.Diagnostics;
using System.Configuration;
using System.Collections.Generic;
using Microsoft.Online.SharePoint.TenantAdministration;
using Microsoft.SharePoint.Client;
using Microsoft.SharePoint.Client.Utilities;
using System.Security;
using System.IO;
using System.Linq;

namespace CA_deleteSubSites
{
    class Program
    {
        static string trgsiteuser = ConfigurationManager.AppSettings["trgsiteuser"]; // xxx.onmicrosoft.com user
        static string mainpath = "https://xxxx.sharepoint.com";

        static void Main(string[] args)
        {
            string trgsiteuserpass = ConfigurationManager.AppSettings["trgsiteuserpass"];
            string path = "https://xxxx.sharepoint.com/sites/demo"; // site to be deleted
            SecureString passWord = new SecureString();
            foreach (char c in trgsiteuserpass.ToCharArray()) passWord.AppendChar(c);
            deleteallSubWebs(path, passWord);
        }

        public static void deleteallSubWebs(string path, SecureString passWord)
        {
            try
            {
                // connect to the root site
                using (ClientContext clientContext = new ClientContext(path))
                {
                    clientContext.Credentials = new SharePointOnlineCredentials(trgsiteuser, passWord);
                    if (clientContext != null)
                    {
                        Web oWebsite = clientContext.Web;
                        clientContext.Load(oWebsite, website => website.Webs, website => website.Title);
                        clientContext.ExecuteQuery();
                        if (oWebsite.Webs.Count == 0)
                        {
                            Console.WriteLine(path);
                            oWebsite.DeleteObject();
                            clientContext.ExecuteQuery();
                            Console.WriteLine("deleted.." + oWebsite.Title);
                        }
                        else
                        {
                            foreach (Web orWebsite in oWebsite.Webs)
                            {
                                string newpath = mainpath + orWebsite.ServerRelativeUrl;
                                deleteallSubWebs(newpath, passWord);
                            }
                            Console.WriteLine(path);
                            oWebsite.DeleteObject();
                            clientContext.ExecuteQuery();
                            Console.WriteLine("deleted..");
                        }
                    }
                }
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
            }
        }
    }
}
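The snippet reads the user name and password with ConfigurationManager, so they must be present in the console application's configuration. A hypothetical App.config sketch (the key names match the code above; the values are placeholders):
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <appSettings>
    <!-- Tenant admin user and password read by ConfigurationManager.AppSettings above -->
    <add key="trgsiteuser" value="admin@xxxx.onmicrosoft.com" />
    <add key="trgsiteuserpass" value="your-password-here" />
  </appSettings>
</configuration>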
Conclusion
Hopefully this will help you delete all sub sites in a site collection, but use it with caution and make sure you know what you are about to delete!
Netwoven Blog | Jul 27, 2015 12:23pm
Hi, I’m Lisa Melchior, President of The Measurement Group LLC, a consulting firm focused on the evaluation of health and social services for at-risk and vulnerable populations. In response to Sheila B. Robinson’s recent post that reported what AEA 365 readers said they want to see in 2015, I’m writing about developing, sharing, and storing lessons learned from evaluation. Although this is written from the perspective of evaluation at the initiative level, it could also apply to lessons learned by an individual program.
The United Nations Environment Programme gives a useful definition of lessons learned as "knowledge or understanding gained from experience." In a grant initiative, lessons learned might address ways to implement the projects supported through that initiative; strategies for overcoming implementation problems; best practices for conducting services (whether or not the projects employed all of them); strategies for involving key stakeholders to optimize the outcomes of the projects and their sustainability; and ideas for future directions. Statements of lessons learned are an important outcome of any grants initiative; the richness and complexity of those statements can be, in part, an indicator of the overall success of the initiative. Funders often utilize the lessons learned by their grantees to inform the development of future investments.
Hot Tips:
Developing lessons learned. If possible, work with the funder to collect examples of lessons learned using the funder’s progress reporting mechanism. When the evaluator has access to such reports, qualitative approaches can be used to catalog and identify themes among the lessons learned. Another benefit of integrating the documentation of lessons learned into ongoing programmatic reporting is that trends over the life of a project or initiative can emerge, since many initiatives request this type of information from grantees on a semi-annual or quarterly basis. Active collaboration between funder and evaluator is key to this approach.
Sharing lessons learned. Don’t wait until the end of a project to share lessons learned! Stakeholders can benefit from lessons learned in early implementation. For example, my colleagues and I highlighted interim outcomes and lessons learned during the first three years of the Archstone Foundation’s five-year Elder Abuse and Neglect Initiative in an article in the Journal of Elder Abuse and Neglect.
In a more summative mode, toolkits are a useful vehicle for sharing lessons learned with those interested in possible replication of a particular program, model, or initiative. Social media and blogs are great for more informal sharing.
Storing lessons learned. Qualitative data tools such as NVivo are invaluable to organizing lessons learned.
AEA365 Blog | Jul 27, 2015 12:23pm
My phone awakens me at 4:45 am. Time to shake off a weekend’s worth of Dead shows and head to Sin City to hang out with my Sisters & Brothers in the Human Resources profession. What could possibly go wrong? Having been to the SHRM Annual Conference nearly 10 times, my event navigation is far more streamlined than a first-time attendee’s. I head to the Dice Bloggers lounge to pick up my badge. Reframing Engagement: For decades, Gallup has been the default source of employee information gathering. Their survey questions, written in 1993, remain the benchmark for workforce engagement in...
SHRM Blog | Jul 27, 2015 12:23pm
Objective
This blog is intended to help people who want to learn about OLAP (Online Analytical Processing) cubes in SSAS. It also shows how to analyze data in a multi-dimensional format for smarter business decisions.
The solution in this blog was created using the Adventure Works DW 2012 database.
Create Data Sources
Select Data sources in Solution Explorer and right click on data sources -> New Data Sources.
The following screen appears…
Then Click Next…
There is no existing connection, so click on the New button.
The following connection manager screen appears. Enter the Server Name Where SQL Server is Installed and Choose Database name and Click Ok.
Now we can see the connection string created on the left side of the Data Source Wizard. Click Next…
Click on "Use the service account" Radio button and Click Next
Enter a Data Source Name and Click on Finish Button.
Create Data Sources View
Select Data source View in Solution Explorer and right click on data source view -> New Data Sources view. The following screen appears…
We can see the data source present in the left section of the Data Source View wizard.
Click Next…
The following screen appears with all the tables that exist in the Adventure Works database.
Select "DimDate", "DimProduct", "DimCustomer" and "FactInternetSales" from the Available objects.
Click the ">" button to move those objects to Included objects.
Click Next…
Click on finish button to create data source view.
The following screen appears after creation of Data source view.
Create New Named Calculation
Select and right click "DimDate" to create New Named Calculation.
The Year format will look like CY2012, CY2013
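The named calculation expression itself is only visible in the screenshot; a plausible reconstruction, assuming the standard Adventure Works DimDate.CalendarYear column, is:
-- Expression for the "CalendarYearDesc" named calculation on DimDate:
-- prefixes the calendar year with "CY", e.g. CY2012, CY2013
'CY' + CONVERT(CHAR(4), CalendarYear)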
Click Ok and the following Screen appears. The "CalendarYearDesc" named calculation is created in "DimDate" Dimension.
Now we create three dimensions: Date, Product and Customer (Create Data Dimension)
Select to create New Dimension. Click Next
Select "Use an existing table" radio button and click Next
Select "DimDate" in Main Table.
Select "DateKey" as the Key column and "FullDateAlternateKey" as the Name column. Click Next
Select the column name which will appear in the cube dimension
Change the attribute for Year, Semester, Quarter and Month from regular to specific.
Click on Attribute Type to get the list of all attribute under the calendar
Type a name of the New Dimension and Click on Finish Button to Create the Date Dimension
Next, create the Product and Customer dimensions following the same steps as above.
Create Hierarchies
In Dimension structure Tab sequentially drag and drop "CalendarYear, CalendarSemester, CalendarQuarter, Month and Date" to Hierarchies.
Create Attribute Relationships
Go to attribute relationship tab. Select and right click on "Date -> CalendarQuarter". Change the Name to Month in Source Attribute and Change the Relationship Type to Rigid and Click on Ok button.
Repeat the above steps for CalendarSemester and CalendarYear as mentioned below:
Select and right click on "Date -> CalendarSemester". Change the Name to CalendarQuarter in Source Attribute and Change the Relationship Type to Rigid and Click on Ok button.
Select and right click on "Date -> CalendarYear". Change the Name to CalendarSemester in Source Attribute and Change the Relationship Type to Rigid and Click on Ok button.
The final hierarchy will look like the following…
View Data in Browser
Go to Browser Tab and Click on Process. The following screen will appear. Click the Run button to process dimension data.
Create Cube
Select and right click on cube to create New Cube. Click Next…
Click on "Use existing tables" radio button and Click Next
Select the check boxes for the fact tables, uncheck all the dimension check boxes, and click Next
Select the measure field of the fact table and Click next…
Uncheck the Dimension and click Next
Type a name of the Cube and Click on Finish button
The cube structure will look like this..
Cube Deployment
Go to the Analysis Services project properties. Type the server name for cube deployment and the name of the SSAS database. Click Ok.
Right click on Analysis Services project and deploy the cube
Netwoven Blog | Jul 27, 2015 12:22pm
I’m Wendy Tackett, the president of iEval, part-time faculty at Western Michigan University, and a blogger at Carpe Diem: Make Your Evaluations Useful. I want to share about a fun way we get our evaluation clients engaged in evaluation…we call it Camp iEval!
The purpose of Camp iEval is to bring evaluation clients together who are working on similar programs and provide 1) training about understanding and using data, 2) analyses of current local data, 3) research on best practice strategies aligned to needs identified through data analyses, and 4) networking among colleagues.
Lessons Learned:
Location, location, location! We’ve hosted Camp iEval several times a year since 2010 in various formats - at my house, a client’s office, a hotel, and via Skype. We’ve found that the best location, by far, was my house! When people are comfortable, they’re much more receptive to open, honest discussions around data and sharing ideas for program improvements. We encourage informal attire and do a potluck lunch, making it a very relaxed atmosphere.
Variety is the spice of life! While we stick closely to the four components of the day, we change it up each time so people don’t get bored. We’ve done things like incorporating hands-on science experiments because the data showed science integration as a weakness, putting on skits to illustrate the benefits of using evaluation, and soliciting programs to give mini-presentations.
Have fun! Having fun while doing evaluation is one of the key tenets of our work. We have created silly awards (e.g., Miss Interpretation), gifted iEval blankets (it’s a running joke because I keep my house so cold), sung songs around a guitar, and eaten tons of homemade goodies.
Relationships are key! I’m sure you’ll agree that the most valuable time at any conference is the networking time, whether formal or informal. We plan formal networking time (i.e., specifically asking programs to share on strategies that have been successful based on our data analyses) and informal networking time (e.g., not a working lunch, general sharing at the end of the day). Because of the casual atmosphere of Camp iEval, the project staff feel comfortable sharing their own data with each other, asking deep questions, visiting each other’s programs, and knowing there are people they can go to for support.
Hot Tip: If you’re interested in finding out more about how to create your own evaluation camp, look for any of the iEval team members at Evaluation 2015 in Chicago. We’re submitting a demonstration proposal; but if we don’t end up presenting, we’d be happy to share more insight informally!
AEA365 Blog | Jul 27, 2015 12:22pm
I just had the opportunity, no, the privilege, of talking with Joe Gerstandt in the blogger’s lounge at the SHRM 2015 Annual Conference & Exposition in Las Vegas. His business card states: "illuminating the value of difference". In our conversation, he passionately and persuasively talked about the importance of valuing difference not just with labels, but with conscious activism. I asked Joe what propelled his passion. The...
SHRM Blog | Jul 27, 2015 12:21pm
Objective
Suppose a manufacturing firm manufactures three products (Brand), say X, Y, Z, and has three sale points (Location), say A, B, C, from which it sells those products. It keeps track of the sale Amount of each product from each sale point on a yearly basis in a table that looks like the one below:
For the sake of simplicity I am keeping the schema easy to understand. Each row in the diagram records in which month of a year which Brand (product) was sold from which Location and how much Amount was earned from selling that product.
Suppose it is required to find out how much of each product was sold from each Location, per Brand, in the year 2014, on a monthly basis. The matrix will look as follows:
So we need the summation of the Amount sold at the intersection of the Brand data points X, Y, Z with the Location data points A, B, C and the Month data points Jan, Feb, etc.
This is what pivoting means: the intersection of data points to summarize the data.
Implementation
A pivot query basically has three parts. Considering the above table, let us understand this properly:
Part 1: It defines the actual data points of the table to be displayed as headers. (What a data point is has already been defined.)
Part 2: This is the actual dataset from the table. Note that the column names of the table are given here to create the dataset as like usual simple query.
Part 3: This is the actual pivoting of the data.
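The query itself appears only as an image in the original post; as a hypothetical reconstruction, assuming a source table named Sales(Year, Month, Brand, Location, Amount), the three parts might fit together like this:
-- Part 1: the data points to be displayed as headers
SELECT Brand, [Location], [Jan], [Feb], [Mar] -- ... remaining months
FROM
(
    -- Part 2: the actual dataset from the table, as in a usual simple query
    SELECT [Month], Brand, [Location], Amount
    FROM Sales
    WHERE [Year] = 2014
) AS src
-- Part 3: the actual pivoting of the data
PIVOT
(
    SUM(Amount)
    FOR [Month] IN ([Jan], [Feb], [Mar]) -- ... remaining months
) AS pvt;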
After executing the above query we find the total sales of Brands X, Y, Z for locations A, B, C in the year 2014 on a per-month basis, which looks as follows:
Conclusion
In this blog I have tried to show how we can use pivoting in SQL Server to gain insight into data by rotating the rows and columns of a table for a multidimensional perspective. By forming an interactive parameterized query, we can consume/display the returned pivot dataset directly, without any further manipulation in .NET applications, SSRS, etc.
Netwoven Blog | Jul 27, 2015 12:21pm
Hi, Veronica Olazabal, with the Rockefeller Foundation here. Having worked in the international development sector for some time and driven by the belief that evaluation is a bridge to effective poverty solutions, assessment of integrated development programs is a passion of mine. At Nuru, a social venture that aims to eradicate poverty in remote rural areas in Africa, we use a combination of approaches to measure the breadth and depth of our programs’ impact. We measure the "parts"—impact of each program—and the "sum of the parts"—the composite programmatic impact on poverty. In operationalizing this strategy over the last few years, here’s what we’ve learned:
Be clear about your definition of poverty from the start. Is poverty based on income or something else? At Nuru, poverty is multidimensional and aligned with Amartya Sen’s definition on access to meaningful choices.
Find a tool that aligns with your definition of poverty. Rad Resources: For multidimensional poverty, try the Multidimensional Poverty Assessment Tool (MPAT) or Multidimensional Poverty Index (MPI). For income-based poverty, try the Progress out of Poverty Index (PPI).
Define your comparison. We have learned from others, and from our own trial and error, that we need a point of comparison. We use comparison farmer households to follow year-to-year changes, national data tools for broader regional comparisons, and comparison of new to returning Nuru farmer households over time.
Articulate clearly how the program will lead to poverty change. Define your theory of change: Is change a step-by-step process? Is it a graduation model? Is it driven by a vertical solution? At Nuru, our programming is sequentially layered. This intentional design allows us to isolate the impact of each program on the sum as each is operationally layered on.
Manage your stakeholder’s expectations. Many poverty experts believe that change takes many years and is too complicated to actually measure. Others believe that catalytic programs may drive marginal levels of change faster. Ensure that your stakeholders’ expectations are clear, both in commitment to programming and measurement.
Don’t forget costs. At Nuru, we are deliberate about cost-effectiveness and achieving financial sustainability. Thus, we are intentional about ensuring that every data point counts toward strategic decision-making (i.e., how to scale, iterate, etc.). If we can’t use it, we lose it.
We are still learning and expect to "calibrate" our programs and measurement strategy over time. If you have experience in this space, please reach out.
Rad Resources
Oxfam Blog: How can we improve the way we measure poverty? by Duncan Green
Macro-level Drivers of Multidimensional Poverty in Sub-Saharan Africa: Explaining Change in the Human Poverty Index by Heath Prince
Photo compliments of Flickr, Grapes of Math by Mark Turnaukcas
AEA365 Blog | Jul 27, 2015 12:20pm
Introduction
We can create a multiple file upload tool with a progress bar using jQuery. It is fully customizable and also lets us update metadata at the same time. The best part is that it does not involve any page load, and it helps to show how long it will take to upload large files.
Prerequisites
1) Go to http://www.uploadify.com/ and download the free Flash version. We can also use the HTML5 version, but it requires a license.
2) Visual Studio 2012
Detailed Method
1) Create a SharePoint 2013 Empty project and give it a proper name.
2) Add new item and add a visual webpart with a proper name.
3) Create a mapped "Layouts" folder for this project.
4) Now copy all the files downloaded from uploadify and paste them into our mapped Layouts folder.
5) Now, in the user control page, add the below code, modified for your project folder.
<link rel="stylesheet" type="text/css" href="/_layouts/15/{mapped_folder_name}/uploadify/uploadify-new.css">
<script type="text/javascript" src="https://code.jquery.com/jquery-1.10.2.min.js"></script>
<script type="text/javascript" src="/_layouts/15/{mapped_folder_name}/uploadify/jquery.uploadify.js"></script>
6) Now create the script for the uploadify functionality
<script type="text/javascript">
    $(function () {
        $("input[id*='fuMiltiFileUpload']").uploadify({
            'formData': { 'strSiteURL': '', 'strLibraryName': '', 'strMetadata': '' },
            'height': 30,
            'swf': '/_layouts/15/{mapped_folder_name}/uploadify/uploadify.swf',
            'uploader': '/_vti_bin/anonsvc/{project_folder_name}/Upload.ashx',
            'width': 120,
            'onFallback': function () {
                // alert('Flash was not detected.'); // fires if Flash is not installed in the browser
            },
            'onUploadError': function (file, errorCode, errorMsg, errorString) {
                alert('The file - ' + file.name + ' - could not be uploaded: ' + errorString);
            },
            'onQueueComplete': function (queueData) {
                // Do whatever we want after all the files have uploaded successfully.
            }
        });
    });
</script>
<asp:FileUpload ID="fuMiltiFileUpload" runat="server" />
7) Look at the "uploader" path given in the above script.
8) We keep the code file under "anonsvc": "/_vti_bin/anonsvc/{project_folder_name}/Upload.ashx". When files are posted from a SharePoint page and try to access our uploader script, there is an authentication issue, so we keep the uploader script under the anonsvc folder where it can be accessed anonymously.
9) This Upload.ashx is nothing but a generic handler which has all the code to upload files to a specific list/library.
How to create a generic handler in SharePoint
1) Create a SharePoint mapped folder and, under "ISAPI > anonsvc", create a folder for this project.
2) Add new item and create a general text file with ".txt" extension.
3) Now rename it to "Upload.ashx".
4) Now open the Upload.ashx file and paste the below code
<%@ Assembly Name="Microsoft.SharePoint, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>
<%@ WebHandler Language="C#" Class="Upload" %>
using System;
using System.Web;
using Microsoft.SharePoint;
public class Upload : IHttpHandler
{
    ///<summary>
    /// You will need to configure this handler in the Web.config file of your
    /// web and register it with IIS before being able to use it. For more information
    /// see the following link: http://go.microsoft.com/?linkid=8101007
    ///</summary>
    #region IHttpHandler Members
    public bool IsReusable
    {
        // Return false in case your managed handler cannot be reused for another request.
        // Usually this would be false in case you have some state information preserved per request.
        get { return true; }
    }
    public void ProcessRequest(HttpContext context)
    {
        string strSiteUrl = context.Request["strSiteURL"];
        SPSecurity.RunWithElevatedPrivileges(delegate()
        {
            using (SPSite site = new SPSite(strSiteUrl))
            {
                using (SPWeb web = site.OpenWeb())
                {
                }
            }
        });
    }
    #endregion
}
5) Under the "ProcessRequest" method we can write our own upload code; a sketch follows this list.
6) In the script we can pass our own data in JSON format via the 'formData' property, and we can access those values in the handler file through the context.Request[""] parameters.
7) As we are accessing this handler file anonymously, we use SPSecurity.RunWithElevatedPrivileges(delegate() to open the web.
Deploy and add this webpart under project
1) Deploy the webpart using visual studio or PowerShell.
2) Open the site in SharePoint designer.
3) Create a normal aspx page under the "SitePages" library. Open the page in Advanced edit mode and add the specific webpart using SharePoint Designer.
4) Run the page in browser and try to upload some files.
Debug the generic handler
1) Run the handler page in the browser with its full path and see if there is any error: http(s)://{site-url}/_vti_bin/anonsvc/{project_folder_name}/Upload.ashx
2) We can also attach the debugger to the "w3wp.exe" process to debug this file.
Note: If you are unable to see the Flash upload button, please add the MIME type to your SharePoint web application using the below PowerShell script.
Write-Host "This script will check if a particular MIME Type is excluded from the AllowedInlineDownloadedMimeTypes list when STRICT Browser File Handling Permissions are set on the Web Application" -ForegroundColor DarkCyan
$webAppRequest = Read-Host "What is the name of your Web Application? i.e. http://<serverName>"
$webApp = Get-SPWebApplication $webAppRequest
$mimeType = Read-Host "Which MIME Type would you like to confirm is included in the AllowedInlineDownloadedMimeTypes list for $webApp ? i.e. application/pdf"
If ($webApp.AllowedInlineDownloadedMimeTypes -notcontains "$mimeType")
{
    Write-Host "$mimeType does not exist in the AllowedInlineDownloadedMimeTypes list" -ForegroundColor Yellow
    $addResponse = Read-Host "Would you like to add it? (Yes/No)"
    if ($addResponse -contains "Yes")
    {
        $webApp.AllowedInlineDownloadedMimeTypes.Add("$mimeType")
        $webApp.Update()
        Write-Host "The MIME Type '$mimeType' has now been added" -ForegroundColor Green
        $iisResponse = Read-Host "This change requires an IIS restart to take effect, do you want to RESET IIS now (Yes/No)"
        if ($iisResponse -contains "Yes")
        {
            IISRESET
            Write-Host "IIS has now been reset" -ForegroundColor Green
        }
        else
        {
            Write-Host "IIS has not been reset, please execute the IISRESET command at a later time" -ForegroundColor Yellow
        }
    }
    else
    {
        Write-Host "The MIME Type '$mimeType' was not added" -ForegroundColor Red
    }
}
else
{
    Write-Host "The MIME Type '$mimeType' already exists in the AllowedInlineDownloadedMimeTypes list for this Web Application" -ForegroundColor Yellow
}
To run the above script you will need to copy it into Notepad and then save it with a filename like strictMimeType.ps1. Once saved you can run it from a PowerShell window like so.
#Use the cd command to navigate to the folder it is in
cd C:\powerShellScripts
# Once you type in '.\' you can press tab to cycle through files in this location
.\strictMimeType.ps1
The MIME type for Flash is "application/x-shockwave-flash".
Netwoven Blog | Jul 27, 2015 12:20pm
I had the distinct pleasure of seeing my friend Steve Browne speak this morning at SHRM. His session was intended to fire up the audience, and I’d say it was a smashing success. One of his comments was powerful, and I thought it deserved to be repeated here because I talk about certification quite a bit. If your certification is purely about getting recertification hours, having letters after your name, and trying to use that as a way to get credits, then you’re wasting your time and your organization’s time. Go ahead...
SHRM Blog | Jul 27, 2015 12:19pm
Hi there, Liz Zadnik here, bringing you another Saturday post focused on practitioner experiences and approaches. Today I’m going to focus on a recent (and recurring) experience of getting others excited about evaluation and capturing information.
It is a source of pride that many of my colleagues have said, "Liz, you bring such an enthusiasm for evaluation - it really helps getting people engaged and interested." Now, I’m not the most knowledgeable or experienced person, but I do know that evaluation and assessment hold an important place in the present and future of the anti-sexual violence movement.
Hot Tip: During a recent webinar I was facilitating, I was talking about sharing data and building trust with community members. I was trying to think of how to explain it and used the analogy of constellations: we do not "own" the stars, but have drawn connections to tell stories about the past, present, and future.
Lesson Learned: Look up! Sometimes this could be literal or figurative. But getting some perspective and being creative can go a long way in engaging people in conversations about evaluation. In my experience, folks often see numbers and equations and statistics (which is fair and true), and this prevents them from seeing how evaluation can help tell a story. Their story.
Rad Resource: The Texas Association Against Sexual Assault released a new toolkit presenting activity-based assessment as a strategy for collecting evaluation data while also implementing a prevention and education program. I’ve found this to be a great way to broaden people’s minds to how evaluation can work for them.
I hope this post has helped illuminate the inner workings of a practitioner passionate about evaluation. My time with aea365 has been incredible so far - I have learned so much and look forward to hearing your thoughts and comments!
AEA365 Blog | Jul 27, 2015 12:19pm
Overview
This is Part II of a blog series on small utilities for managing Office 365 sites. Part I of this series demonstrated the use of C# CSOM to delete sub sites. The current article presents an alternate option: using PowerShell to delete the sub sites of an Office 365 site.
SharePoint Online PowerShell commands are still very limited, which leaves us to use CSOM from PowerShell to perform the necessary management activities for Office 365. An important point to note is that you need to make use of a few SharePoint client-side DLLs to perform these activities from PowerShell. The script below is specifically designed to recursively delete all sub sites under a given site on an Office 365 tenant, but it can easily be adapted to perform other activities on all sub sites as well. I will follow up this article with some other utilities in the near future.
Solution
Use the SharePoint Online Management Shell to execute the script. If you do not have the shell already, you can download it from http://www.microsoft.com/en-in/download/details.aspx?id=35588
Build the credential that you will use to connect to SharePoint Online. You must use the tenant admin credential for this; it looks like adm.<userid>@<your tenant>.onmicrosoft.com
$credentials = Get-Credential
# This would prompt for admin userid and password. Fill in those details.
# Register the SharePoint client DLLs. If you have Visual Studio 2012 or 2013 installed you can find these DLLs by searching your system drive for "microsoft.sharepoint.client*.dll"
# Alter the below DLL paths for your computer
Add-Type -Path "C:\<ClientDLLS>\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\<ClientDLLS>\Microsoft.SharePoint.Client.Runtime.dll"
Add-Type -Path "C:\<ClientDLLS>\Microsoft.SharePoint.Client.Taxonomy.dll"
# Connect/authenticate to SharePoint Online and get a ClientContext object.
# $Url is the site whose sub sites will be deleted - substitute your own site URL.
$Url = "https://<your tenant>.sharepoint.com/sites/<site>"
$clientContext = New-Object Microsoft.SharePoint.Client.ClientContext($Url)
$clientContext.Credentials = $credentials
if (!$clientContext.ServerObjectIsNull.Value)
{
Write-Host "Connected to Office 365 site: ‘$Url’" -ForegroundColor Green
}
# Load the root site from the context
$rootWeb = $clientContext.Web
$clientContext.Load($rootWeb)
$clientContext.ExecuteQuery()
# Create a recursive delete function
function deleteWeb($web)
{
$clientContext.Load($web)
$clientContext.ExecuteQuery()
Write-Host "Web URL is" $web.Url
$subwebs = $web.Webs
$clientContext.Load($subwebs)
$clientContext.ExecuteQuery()
Write-Host "Child count: " $subwebs.Count
if ($web.Webs.Count -eq 0)
{
if ($web -ne $rootweb)
{
Write-Host "Deleting Site : " $web.Url
$web.DeleteObject()
$clientContext.ExecuteQuery()
}
}
else
{
foreach ($subweb in $subwebs)
{
Write-Host "SubWeb URL is" $subweb.Url
deleteWeb($subweb)
}
Write-Host "Deleting Site: " $web.Url
deleteWeb($web)
}
}
# Call the function
deleteWeb($rootWeb)
Netwoven Blog | Jul 27, 2015 12:19pm
I’m Tom Chapel. My "day job" is Chief Evaluation Officer at the CDC, where I help our programs/partners with evaluation and strategic planning. I took on both roles because large organizations do strategic planning and evaluation in different silos, even though both silos start with "who are we?" "what are we trying to accomplish?" and "what does success look like?"
In response, we’ve crafted an approach to strategic planning which employs logic models, but in a different way than for evaluation. The key steps: Compose a simple logic model of activities and outcomes (or what some might call a "theory of change"). I want stakeholders to understand the "what" of their program (activities) and the "so what" (the sequence of outcomes/impacts). Usually, we add arrows to reflect the underlying logic/theory.
Choose/affirm an "accountable outcome". It’s great to include "reduced morbidity and mortality" in the model as a reminder of what we’re about. But be sure to explain that these are areas for "contribution" and not outcomes attributable solely to their efforts.
Have the "output talk". The model shows which activities drive which outcomes. Outputs are the chance to define how the activity MUST be implemented for those outcomes to occur. This discussion sets up creation of process measures for the evaluator later on but at this point provides clarity for planners and implementers on the levels of intensity/quality/quantity needed.
Help them identify "killer assumptions". There are dozens of inputs and moderating factors (context) over which a program has less or no control. Look for ones so serious that, if that input or moderator is not dealt with, the program really can’t achieve its intended outcomes. Depressing as this exercise can be, it spurs creative thinking: how might we work around/refine our activities to accommodate it?
Tie it all together with a (short) list of key strategic issues. Hit the high points (mission, vision, SWOT) and move on to goals and objectives. This technique avoids the painful wordsmithing that often comes with traditional strategic planning.
Lessons Learned:
Use existing resources. The organization may have a mission and vision, an existing strategic plan, a business plan, or a set of performance measures. Extract the starter model from these resources so they see the logic model as a visual depiction of how they already think about their program and not something completely new.
Do the process in digestible bites and WITH the program. You want people to follow the storyline and that happens more often if they are part of the model construction.
If in return for minimal word-smithing we inflict endless arrow-smithing, fatigue will soon set in. Declare victory when the group is 85% in agreement with the picture.
Rad Resource: Phillips and Knowlton: The Logic Model Guidebook (2nd edition)
The American Evaluation Association is celebrating Logic Model Week. The contributions all this week to aea365 come from evaluators who have used logic models in their practice.
AEA365 Blog | Jul 27, 2015 12:19pm
A poetry slam, a swanky Marcus Buckingham soirée, and an 80's jam couldn't keep me from waking up at 6am to hustle over to the HR Capital of the Universe. I step into the Vegas heat, my heart pumping pure mini-bar, then jump into a bus blasting a Rico Suave / Maria Maria mash-up at 10,000 watts. It's Day 3 at the SHRM Annual Conference and there is no time for tomfoolery... We've got knowledge to consume. What better way to wake up than with HR's Global Ambassador of...
SHRM Blog | Jul 27, 2015 12:19pm
One of the most challenging things organizations face today is managing the amount of data produced on a daily basis. IDC forecasts a 44-fold increase in data volumes between 2009 and 2020, which means content management will only become tougher going forward.
Learn how to maximize OneDrive with Office 365 and have virtually ubiquitous access to open, edit and share data from anywhere. Office 365 users can also stop having to choose which files or data deserve to occupy the cloud and which should remain local.
Microsoft’s research indicates that three out of four users have less than 15 GB of local files on their PC/Laptop, so it believes that the new 15 GB limit in OneDrive will accommodate the needs of the vast majority of users.
Key Takeaways:
Why use OneDrive with Office 365
Comparison between OneDrive and other cloud storage products
OneDrive Migration
Integration with Office 365
If you register by Jul 25th, 2014 you will be entered to win an all new iHealth Wireless Activity & Sleep Tracker of retail value $79 and we are also raffling for a Water Resistant Wireless Bluetooth Speaker of retail value $139 for those who stay on for the full webinar.
Netwoven Blog | Jul 27, 2015 12:19pm
We are Debra Smith and Galen Ellis, two evaluators who discovered through AEA that we share a common method of using logic models to facilitate systems thinking with our clients. Many people think logic models are a complicated exercise with little value. Some are downright cynical, saying they tend to represent "a tenuous chain of unproven assumptions used to justify the pre-determined program model" (Public Health Director).
We both use a two-phase logic model development process: first, we help our clients develop a balcony-view "theory of change" by identifying the global goal or vision and mapping key resources, strategies and outcomes. Clarity in Phase I makes Phase II—identifying outputs and short-, mid- and long-term outcomes and measures—more manageable and meaningful.
Debra: I first used this approach while working with a museum education department to develop an evaluation system for their programs. We mapped the overall theory of the department, tracking resources and activities leading to their long-term vision, which they described as "the community loving the museum." Staff were then able to develop logic models for their individual programs, and then a system that streamlined the data they collected within and across programs.
Galen: I have facilitated logic model processes for the development of agency-wide evaluation systems with several organizations in this two-step process. The theory of change process helps the client articulate how their activities and the outcomes they expect fit with their agency’s values and mission. Then I work with each individual program/project within the organization to develop its own logic models that link to the agency’s broader theory of change. This shifts the culture of the organization towards being outcomes-based, and helps connect the distinct programs via common outcomes that reflect the agency’s values and mission.
Lessons Learned:
Logic models can help prevent mission drift. The agency-level logic model will capture outcomes that are aligned with the mission. Programs within the organization can then align with those outcomes and share evaluation measures, leveraging the broader organizational goals to guide their own success.
Using the logic model process to develop an agency-wide evaluation system elevates the value of evaluation within the organization.
Rad Resources:
WK Kellogg Foundation Logic Model Development Guide
Tearless Logic Model
Hot Tips:
Showing how a logic model tells a story can help clients understand the role and value of a logic model. Galen uses the metaphor of crossing a river. Video Clip
Even in developmental projects, it can be helpful to map the theory of change, then refine it based on what is learned.
AEA365 Blog | Jul 27, 2015 12:18pm
Introduction
SharePoint 2013 Search enables users to modify the managed properties of crawled items before they are indexed by calling out to an external content enrichment web service. The ability to modify managed properties for items during content processing is helpful when performing tasks such as data cleansing, entity extraction, classification and tagging.
The content processing component is designed over a fixed pipeline, which in turn is made of several processing stages arranged in sequence to perform distinct activities while processing a document for indexing.
While the content processing component provides several improvements over SharePoint 2010 enterprise search (not FAST Search for SharePoint 2010), it introduces a bottleneck where custom processing is needed in the pipeline. For custom processing, SharePoint 2013 provides a mechanism called the "Content Enrichment Web Service" (CEWS, shown as the Web Service Callout in the diagram above). This is, in principle, a hook in the pipeline for an external WCF service.
The two major drawbacks of this approach are:
Whatever custom processing we need must be performed within this single WCF service call. Only one CEWS registration is allowed per pipeline (and only one pipeline is allowed per Search Service Application). This is a bottleneck when we require multiple external processing steps for documents passing through the pipeline.
After we register the CEWS, it applies to all content sources of the Search Service Application. In a practical scenario we might have multiple search content sources for a Search Service Application, each with different requirements, and there is no way to achieve this with a single registration. This is explained below.
Content Source 1 ->required to process Managed Property values from Repository 1
Content Source 2 -> do not need to process any values from external repository
Content Source 3 -> required to process Managed Property values from Repository 2
There is no need for a Content Enrichment Web Service call for Content Source 2, while Content Source 1 and Content Source 3 need to call two completely different repositories to get the managed property values. Using a single web service call for both of them will not solve the problem here. Registering a WCF Routing Service as the CEWS can solve this problem, which we discuss in the section below.
Solution Overview
In our demo solution there are two different content sources for a single Search Service Application. The requirements are:
For one of the content sources we need to generate the document preview using the Longitude Preview Service from BA-Insight, and at the same time populate some managed property values coming from a SQL Server database. Let’s name it "Content Source CEWS Multiple". BA-Insight uses its own Content Enrichment Web Service to generate document previews.
For the other content source we need to generate the document preview only. Let’s name it "Content Source CEWS Single". This one only needs to call the BA-Insight preview generator Content Enrichment Service.
Introducing WCF Workflow Service
A WCF Workflow Service has the ability to call more than one WCF service. Instead of registering a simple WCF service as the endpoint for the Content Enrichment Web Service, we can register a WCF Workflow Service as the endpoint and then call our custom WCF services from the Workflow Service. The WCF Workflow Service can call the BA-Insight preview generator service first to generate the preview. Then it will call our custom WCF service, which gets the values of the managed properties from the SQL Server database. After getting the values, the Workflow Service will create the output properties and send them back to the SharePoint pipeline, where SharePoint will populate the managed property values. But this would apply to both of the content sources, which is not desired, as mentioned earlier. The second content source, "Content Source CEWS Single", needs to call only the BA-Insight preview service to generate the document previews. To resolve this we need the help of the WCF Routing Service, described below.
WCF Routing Service
WCF 4.0 introduced a new service called the Routing Service. Its purpose is to pick up a request from the client and, based on routing logic, direct the request to the proper endpoint or downstream service. These downstream services may be hosted on the same machine or distributed across several machines in a server farm. So instead of registering the WCF Workflow Service as the endpoint of our CEWS, we register the WCF Routing Service as the endpoint. The SharePoint pipeline will call the WCF Routing Service during the crawl with some input and output properties. Based on the input property parameters, the routing service will then redirect the request to either the WCF Workflow Service or the BA-Insight Preview Service. To understand this in detail we need to look at some details of the SharePoint Content Enrichment Web Service.
SharePoint Content Enrichment Web Service Components
Following are some key parameters of the Content Enrichment Web Service that can be defined during the registration of the service.
1. InputProperties: The InputProperties parameter specifies the managed properties sent to the service.
2. OutputProperties: The OutputProperties parameter specifies the managed properties returned by the service.
Note that both are case-sensitive. All managed properties referenced need to be created in advance.
3. Trigger: A trigger condition that represents a predicate to execute for every item being processed. If a trigger condition is used, the external web service is called only when the trigger evaluates to true. If no trigger condition is used, all items are sent to the external web service.
4. SendRawData: A SendRawData switch that sends the raw data of an item in binary form. This is useful when more metadata is required than can be retrieved from the parsed version of the item. In our case we need to set it to true, since the BA-Insight preview service works from the raw content of each document.
5. TimeOut: The amount of time in milliseconds until the web service times out. Valid range: 100 - 30000. In our case we'll set it to a higher value, since we are using multiple services and at some point they will be heavily loaded.
The detailed configuration options for the Content Enrichment Web Service can be found on MSDN. The following is a sample PowerShell script to register the CEWS.
$ssa = Get-SPEnterpriseSearchServiceApplication
$config = New-SPEnterpriseSearchContentEnrichmentConfiguration
$config.Endpoint = "http://Site_URL/<service name>.svc"
$config.InputProperties = "OriginalPath,Body"
$config.OutputProperties = "OpProp1,OpProp2,OpProp3,OpProp4"
$config.SendRawData = $True
$config.MaxRawDataSize = 8192
$config.TimeOut = 10000
Set-SPEnterpriseSearchContentEnrichmentConfiguration -SearchApplication $ssa -ContentEnrichmentConfiguration $config
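After registration, the companion cmdlets can be used to verify the configuration or clear it while iterating on the endpoint:
# Inspect the Content Enrichment configuration currently attached to the SSA
Get-SPEnterpriseSearchContentEnrichmentConfiguration -SearchApplication $ssa
# Remove it entirely if the endpoint needs to be re-registered from scratch
Remove-SPEnterpriseSearchContentEnrichmentConfiguration -SearchApplication $ssa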
Putting It All Together
Schematic flow diagram for Overall Search Enrichment Process
The schematic diagram above lays out the entire logic of the Search Enrichment process.
The WCF Routing Service is configured as the endpoint of the Content Enrichment configuration. Only the contents in the "Content Source CEWS Multiple" Content Source need to be updated with the managed property values from the SQL Server database, and therefore it behooves us to forward the content processing request to the WCF Workflow Service only when the document being crawled exists in that Content Source. For documents in the other Content Source, we only need to generate the document preview, so instead of routing the request to the WCF Workflow Service, we simply send it to the BA-Insight Longitude Preview Generation Service.
The Routing Service routes each request based on a routing filter. In this case the filter is configured on the managed property named "ContentSource" and its value. The same concept can be applied when there are more Content Sources that need different repositories to populate their managed property values. The one thing to remember is that the code needs to be very efficient: there is some very heavy processing involved while documents flow through the pipeline, and the SharePoint Search service (noderunner.exe) is itself very memory hungry.
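Only two lines of the registration script shown earlier change to support this routing. A minimal sketch, assuming a hypothetical router address and that the default "ContentSource" managed property is available to be sent as an input property:
# Register the Routing Service, not an individual service, as the single CEWS endpoint
$config.Endpoint = "http://Site_URL/ContentEnrichmentRouter.svc"
# Include ContentSource so the routing filter can branch on its value
$config.InputProperties = "OriginalPath,Body,ContentSource"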
Netwoven Blog
The importance of traditions in employee engagement
It doesn't take a rocket scientist to figure out that when employees feel like they are part of something, they contribute more than just positive energy. Their energy ripples throughout the entire framework of an organization and ultimately contributes to bottom-line profitability, even though their exact effect is mostly immeasurable. This is what's known as employee engagement, and it's the new watchword for Human Resources professionals who do more than just their job. Going the extra mile, however, is easier said than done. The Gallup organization's...
SHRM Blog
My name is Michele Tarsilla (@MiEval_TuEval) and I am a transformative evaluator with a focus on capacity development in international and cross-cultural settings. Having worked in 30 countries, I have become aware of the detached (and somewhat cynical) attitude that grantee organizations have towards their funder's requirement to develop and use logic models (see the table below). As a result, the development of logic models has often been integrated uncritically into organizational practices, merely as a simple "password for funding".
Source: www.keystoneaccountability.org
In response to such mechanistic use of logic models among many organizations working in international development, my effort has been to strike a balance between:
the need for accountability to my main client (e.g., the international organization asking me to work with local grantees and staff to develop a logframe and a theory of change); and
the ethical/professional (rather than contractual) obligation to be accountable to those very same local grantees and staff whose planning, monitoring and evaluation capacity development I am expected to contribute to.
Lessons Learned:
In an effort to promote a genuine understanding of how a logic model could indeed become an organizational asset (and, by so doing, to enhance ownership of both the final product and the process leading to its development), I have often asked my clients two things. First, to challenge some of the long-term goals recommended by the funders and often inserted by default in the Logic Model template distributed to them. I particularly encourage them to translate those often ambiguous goals into lower-level objectives aligned with their specific vision. A small organization in Kinshasa that supported the professional development of young artists, for instance, did not see the relevance of including the Millennium Development Goal on poverty reduction (which the funder had assigned to them) as the ultimate rationale for their program in their logical framework. As a result, they replaced the goal with a different one ("Increased support by the National Ministry of Culture for youth Culture and Development creations in the Kinshasa province").
Second, I invite local organizations and staff to combine the monitoring of activities and processes that funders are particularly interested in (e.g., for accountability and comparability purposes across project sites) with that of one or two additional programmatic aspects, even if these are ignored by the funders' guidelines. Furthermore, I push them toward an ever more creative visualization of their programs' inputs and results ("frameworkers" will favor linear representations of program processes, whereas "circlers" will be keener to embrace a systemic and adaptive perspective of their program dynamics).
Rad Resource: For an interesting review of different logic model development processes, see Reina Neufeldt's 2011 handbook on "Frameworkers" and "Circlers".
The American Evaluation Association is celebrating Logic Model Week. The contributions all this week to aea365 come from evaluators who have used logic models in their practice. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Related posts:
Logic Models Week: Debra Smith and Galen Ellis on How Logic Models Can Be Used to Develop Evaluation Systems
Charles Gasper on Logic Models
Michael Duttweiler on Talking Your Way Into a Logic Model
AEA365 Blog
Overview
When I am exploring the breadth of SharePoint 2013 PowerShell and would like to dig a little deeper, to understand what each of the commands is doing and what set of commands is available for a given operation or area of SharePoint functionality, or to debug some of the run-time issues I may incur, I like to explore with .NET Reflector.
Here I want to provide you with some guidance on finding your way around the SharePoint 2013 PowerShell assembly.
Setup
First of all, of course, you will need your SharePoint 2013 on-premises installation and a live farm in place.
You could use any commercially available .NET decompiler; I use Redgate .NET Reflector.
Exploring
Run .NET Reflector.
Locate the PowerShell assembly as below. With .NET 4, assemblies are now located under C:\Windows\Microsoft.NET\assembly
For the PowerShell assembly, locate the following folder:
C:\Windows\Microsoft.NET\assembly\GAC_MSIL\Microsoft.SharePoint.PowerShell\v4.0_15.0.0.0__71e9bce111e9429c\Microsoft.SharePoint.Powershell.dll
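The version segment of this path can vary with the patch level, so if the folder is not exactly as above, a quick PowerShell check will locate the DLL before you load it into Reflector:
Get-ChildItem "C:\Windows\Microsoft.NET\assembly\GAC_MSIL\Microsoft.SharePoint.PowerShell" -Recurse -Filter "Microsoft.SharePoint.PowerShell.dll" | Select-Object FullName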
Once you have loaded the PowerShell assembly, you will see the following namespaces.
To look up the usual SharePoint cmdlets, such as the Get/Set/New/Delete/Remove commands, browse under the SPCmdlet groups below to locate your desired command.
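As a quick cross-check on a live farm, PowerShell itself can enumerate the same cmdlets without Reflector:
# Load the SharePoint snap-in if it is not already present in the session
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
# All cmdlets whose noun is SPSolution (Add, Get, Install, Remove, ...)
Get-Command -Noun SPSolution
# Every Get-SP* cmdlet exposed by the snap-in
Get-Command -Verb Get -Noun SP*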
Let's explore one of the cmdlets we all use: Add-SPSolution.
Locate SPCmdletAddSolution as below:
In the details pane you should see partial details of the methods, as below.
Now click on Expand Methods to further expand the details of each method. A partial view is below:
Here, under the SPSolution method, you will see that there is a call to the _LocatFarm object, which is based on the SharePoint object model.
So let's click on the Add method, and you will see that the call points into the object model, where you can further explore the real call details.
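Reflector is the comfortable way to browse, but the same type inventory can be pulled with plain PowerShell reflection, which is useful on a server where Reflector is not installed. A minimal sketch, using the GAC path located earlier:
# Load the assembly directly from the GAC path found above
$path = "C:\Windows\Microsoft.NET\assembly\GAC_MSIL\Microsoft.SharePoint.PowerShell\v4.0_15.0.0.0__71e9bce111e9429c\Microsoft.SharePoint.Powershell.dll"
$asm = [System.Reflection.Assembly]::LoadFrom($path)
# List every cmdlet class in the assembly, e.g. SPCmdletAddSolution
$asm.GetTypes() | Where-Object { $_.Name -like "SPCmdlet*" } | Sort-Object Name | Select-Object Name, Namespace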
Conclusion
Likewise, by knowing your API calls and being able to explore the specific calls behind each cmdlet, you can further diagnose your farm, configuration, and development issues.
Netwoven Blog