

In 2012, California passed a law requiring companies with five or more employees to offer retirement plans. The California Secure Choice Retirement Savings Trust Act is expected to be implemented by... Visit site for full story...
TriNet . Blog . Jul 27, 2015 12:46pm
Overview

You have a SharePoint 2010/2013 farm deployed. In the same farm, you are trying to back up a site collection and restore it to a different site URL, managed path, or web application. You are doing this using the following PowerShell commands: Backup-SPSite and Restore-SPSite. The Backup-SPSite will complete successfully, but while executing Restore-SPSite you might get the following errors:

Restore-SPSite : The operation that you are attempting to perform cannot be completed successfully. No content databases in the web application were available to store your site collection. The existing content databases may have reached the maximum number of site collections, or be set to read-only, or be offline, or may already contain a copy of this site collection. Create another content database for the web application and then try the operation.

Restore-SPSite : Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED))

Cause

Condition for site restore: When you back up a site collection and restore it within the same web application, you need a separate content database to restore to. Refer to the MSDN article: if a site collection is backed up and restored to a different URL location within the same web application (it will have the same SiteID GUID), an additional content database must be available to hold the restored copy of the site collection.

Condition for site delete: If you are backing up and restoring the site collection to be served from a different URL, you will be deleting the original site collection before you restore it. Starting in SharePoint 2010, the site collection deletion process has changed. Refer to Bill Baer's blog.

Condition for the Access Denied error: When the above conditions are met and you try to restore the site collection, you may be greeted with the Access Denied warning. This happens when the user login you are using to log in to the SharePoint server (over RDP), usually the farm admin, is not the primary or secondary administrator of the site collection being restored.

Resolution

You are going to perform this operation on the SharePoint server using the farm administrator account, and your current site collection owner is not the farm admin. In this case, set your farm admin as the primary or secondary site collection administrator (this is temporary, until the restore operation is complete). If you did not do this and have already deleted the site collection with no backup, there is still no need to worry; follow the further steps below, but at least know who the primary or secondary site collection administrator is.

Back up your site collection using Backup-SPSite. If you need to delete the site collection because you are moving it, you will need to delete the backed-up site collection from SharePoint. Before you delete it, ensure you have backed up the content database that contains this site collection. If you delete the site collection using Remove-SPSite, it will be deleted permanently. If you delete it from Central Administration or using Remove-SPSite with the -GradualDelete option, the site collection will be marked for deletion and will be deleted by the "Gradual Site Delete" timer job.
You can list the sites marked for deletion by running Get-SPDeletedSite and follow up with Remove-SPSite, or you can force the timer job with Run Now to delete the pending site collections that are marked for deletion. At this stage your site is removed successfully.

Now you are ready to restore your site collection. Use the Restore-SPSite PowerShell command to restore. If you come across the Access Denied error, it means neither the site collection's primary nor secondary owner is a farm admin. In that case (if you did not reset the site collection primary or secondary admin to the farm admin before the backup), follow these steps:
1. Find out who the primary/secondary site collection administrator was.
2. Temporarily make that site collection administrator a farm admin and grant rights (I gave full rights until the restore was successful) on the content database being restored to.
3. Have the site collection administrator log in to the SharePoint server farm (this is the only way, because you did not reset the site collection owner to the farm admin before the backup).
4. Then run Restore-SPSite.
5. Now remove the user from the farm admins and remove the SQL full rights on the content database.

If you had reset the site collection administrator to the farm admin prior to the backup, then set the appropriate business users back as the site collection administrators.
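To make the sequence concrete, here is a minimal PowerShell sketch of the backup, gradual delete, and restore flow described above. The URLs, backup path, and content database name are illustrative placeholders, not values from this post; run it in the SharePoint Management Shell and adjust to your farm.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# 1. Back up the source site collection
Backup-SPSite -Identity "http://portal/sites/source" -Path "D:\Backups\source.bak"

# 2. If the site collection is being moved, remove the original gradually;
#    the "Gradual Site Delete" timer job performs the actual deletion later
Remove-SPSite -Identity "http://portal/sites/source" -GradualDelete -Confirm:$false

# 3. Check which site collections are still pending deletion
Get-SPDeletedSite

# 4. Restore to the new URL, targeting a separate content database
Restore-SPSite -Identity "http://portal/sites/target" -Path "D:\Backups\source.bak" -ContentDatabase "WSS_Content_Target"

Remember that the account running Restore-SPSite (or the logged-in site collection administrator, as discussed above) still needs the appropriate permissions on the target content database.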
Netwoven . Blog . Jul 27, 2015 12:46pm
This post is part of TriNet’s ongoing series about the Affordable Care Act and its effects on small business. The Affordable Care Act (ACA) is here to stay and any business owner with at least one... Visit site for full story...
TriNet . Blog . Jul 27, 2015 12:45pm
We're now in the home stretch of my tips for harassment policies and complaint procedures. (It's not exactly like American Pharoah's "home stretch" run for the Triple Crown - but close!) "Pregnancy" as a Protected Characteristic My first tip creates a very cathartic experience because it forces me to admit that nobody's perfect, including me. In my case, the "no-harm/no-foul" rule saved me. But there was once a client who wasn't as lucky. The client had an EEOC pregnancy discrimination charge and, lo and behold, the long list of protected characteristics in the company's harassment policy did not include...
SHRM . Blog . Jul 27, 2015 12:45pm
Overview

While building a public site, in other terms a customer-facing site, you need better web content management capabilities. Generally you will have authors who are responsible for producing the content, and the produced content goes through an approval process before it becomes visible on the public site. Public sites are also usually graphics heavy, so you want to better manage the graphics or digital assets as well. SharePoint 2013 provides a new feature called cross-site publishing. Utilizing this feature, you can design a more efficient public site. In this blog I will discuss cross-site publishing along with an example publishing site implementation.

Cross-Site Publishing

Cross-site publishing is a feature-driven method that lets you create and maintain content in one or more authoring site collections or web applications and publish or display this content in one or more publishing site collections by using Content Search Web Parts. Cross-site publishing complements the already existing publishing method, author-in-place, where you use a single site collection to author content and make it available to readers of your site. Put simply, you write content in one place and have it published somewhere else. Microsoft provides a visual presentation of this model.

How does cross-site publishing work?

Cross-site publishing uses search technology to retrieve content. On a site collection where the Cross-Site Collection Publishing feature is enabled, libraries and lists have to be enabled as catalogs before the content can be reused in other site collections. The content of the library or list catalogs must be crawled and added to the search index. The content can then be displayed in a publishing site collection by using one or more Search Web Parts. When we change the content in an authoring site collection, those changes are displayed on all site collections that reuse this content, as we are using continuous crawl.

Why use cross-site publishing for your website?

Consider the scenario where content authors add content in a controlled environment, meaning they add the content in an authoring site that is AD authenticated. Let us name it http://NWAuthor. This content is shared or displayed in a publishing sites web application which is configured to allow anonymous access for external users. Let us name it http://NWPublish. At the same time you create another web application where you store your site assets such as images and videos. This web application allows read access to anonymous users while requiring authentication for modifying or adding content. Let us name it http://NWAssets. A high-level architecture diagram of these three web applications accompanies the original post.

Now we will go through the process step by step. The first thing we have to do is to create three site collections.

Step 1: Create the authoring site

First, we create a new web application with Windows authentication. Then we create an authoring site collection. To create the site collection, we need these details:
A title for the website, which is "NWAuthor", or any name you like.
The website's URL.
Select 2013 for the experience version.
From the Publishing tab, select the Team Site template.
In the Primary Site Collection Administrator field, enter the site admin's user name.
We will create the site collection based on the Team Site template; the ideal situation is choosing the "Product Catalog" template.
By choosing the "Product Catalog" template, SharePoint creates the following artifacts for the site collection:
Activates the Cross-Site Publishing feature
Creates a Products list associated with the content type Product with Image
Creates two content types, Product and Product with Image
Creates the following site columns: Group Number, Item Category (a managed metadata column linked to the Product Hierarchy term set), Item Number, Language Tag
Creates the site collection term set "Product Hierarchy"
As we only require the Cross-Site Publishing feature, we are choosing the Team Site template and then activating this feature.

Step 2: Create the publishing site

Create another web application with anonymous access and name it "NWPublish". Select the "Entire web site" option for anonymous access. Next we create the root site collection based on the Publishing Portal template.

Step 3: Create the asset library

Set up an asset library. This can exist anywhere as a container to store site blobs (pictures, videos, PDFs, etc.); in this case the asset library exists in its own web application with the URL http://NWAssets. Again, this web application has anonymous access for lists and libraries only. Create a picture library named "NWAssets" and upload the images there. For the specific list where we are going to store the blobs, in this case "NWAssets", we check the "View only" option. As of now we have created our three web applications.

Step 4: Create the content type and list

The authoring site is ready. Let's create the list which holds the contents of the site. Create a list named "websitecontent". Now go to the list settings, then catalog settings. Under catalog settings, check "Enable this library as a catalog". It is good to keep the number of lists in your authoring site to a minimum, as this will help you maintain the site; otherwise you have to mark every list as a catalog and then make all the catalog connections from the publishing site.

What is a catalog? A catalog is a list or library that is shared out to search for consumption on publishing sites. Catalogs enable content to be published across site collections; the cross-site publishing features depend on catalogs. Any list or library can be marked as a catalog by going to the Catalog Settings page and selecting "Share the list as a catalog for other sites and collections". After you connect a publishing site to this catalog, the fields that you specified as catalog item URL fields appear as part of the friendly URL, but we do not require this.

Step 5: Add images to the asset library

Before adding list items, we add the relevant images to the asset library. Open the asset URL, i.e. http://NWAssets, and add images there. The image is first uploaded to the asset library, and that URL is then referenced from the authoring list. Next we go to the publishing site and manage the catalog connection there.

Step 6: Start adding content

You have already created the list in your authoring site; now it is time to add content there.

Step 7: Manage the catalog connection

Once we enter content in the authoring site and mark the list as a catalog, our next task is to run a full crawl (see the PowerShell sketch below). After the full crawl completes successfully, from the site settings of the publishing site, click on Manage Catalog Connections and create the catalog connection. Once you click on "Manage catalog connections", a screen pops up; click on the Connect link. While creating the connection, choose "Connect, but do not integrate the catalog".
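As a side note, the full crawl mentioned in Step 7 can also be kicked off from PowerShell. This is a minimal sketch assuming the default content source name "Local SharePoint sites"; your search service application and content source may be named differently.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Get the Search service application and the content source that covers the authoring site
$ssa = Get-SPEnterpriseSearchServiceApplication
$contentSource = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Local SharePoint sites"

# Start a full crawl so the catalog-enabled list is added to the search index
$contentSource.StartFullCrawl()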
Step 8: Start creating pages

Once the content is ready for display, start creating pages in the publishing site. Create a custom page in the publishing site and add a Content Search Web Part (CSWP). Now the CSWP is ready to use the catalog. Edit the CSWP and click on "Change Query". Go to Advanced mode and, under the "Select a query" option, select the "Author - Website content result" source. Close the web part and check in the page. Now your page is ready with content and image. Here I have created a custom display template to show the content and image; you can create a new display template and apply whatever look and feel you want for your content.

Conclusion

This is a simple demonstration of using cross-site publishing to create a public site. The main advantage of cross-site publishing is that you create content once and publish it in many places, and with anonymous access anyone can view the publishing site. Another advantage is that, with the help of display templates, you can display the same content in different ways in different places. In the next article I will discuss display templates and managed navigation that we can use with cross-site publishing.
Netwoven . Blog . Jul 27, 2015 12:45pm
5 reasons CFOs are interested in workforce analytics & automation

Identifying empty labor: On average, US employees waste 2 hours a day beyond breaks and lunch hour. However, most companies only have self-reporting methods to track the amount of work and time spent on various tasks. WorkiQ provides real-time collection and reporting, revealing instant performance measurement of both in-house and remote employees.

Identifying areas to reduce overtime: According to a recent survey, the average American works an hour of overtime each week. Sometimes the business may need overtime to get through peak periods, but how do you truly know without accurate data? WorkiQ provides real-time data showing whether empty labor is a potential symptom of excess overtime. The solution can identify the amount of time spent on productive and non-productive activities and categorize the type of work that consumes the most labor hours.

Workforce analysis: Do you have the right number of people assigned to the appropriate inventory of work? How many people do you need to handle open enrollment this year? Take the guessing out of staffing; WorkiQ provides data on actual activity and work productivity, giving you true FTE analysis to ensure you have the right team for the workload.

Real-time and accurate performance data: Identifying top performing teams and individuals is critical to building a culture of accountability and high employee engagement. Real-time workforce analytics provides the operational intelligence necessary to evaluate true staffing needs, reduce outsourcing, and lower the overall costs of operations. By providing real-time dashboards with insights into actual performance at any given moment, your managers will be empowered to provide "in the moment" coaching and guidance for optimal performance.

Spotting process deficiencies: Organizations can't always see the different steps it takes to process a transaction within their operations. By employing analytics to visualize the work path, you can identify the critical details needed to reduce bottlenecks. These process steps can then be eliminated or automated through WorkiQ Robotic Process Automation.

The post Workforce Analytics for the CFO appeared first on WorkiQ Blog.
WORKIQ . Blog . Jul 27, 2015 12:44pm
Overview

Beginning with SharePoint 2013, workflows are fully declarative irrespective of whether they are designed using SharePoint Designer or Visual Studio. Declarative means that workflows are no longer authored in code and compiled into managed assemblies; instead, workflows are described in Extensible Application Markup Language (XAML) and interpreted at execution time into activities and sequences. Being the native building block, the XAML representation of a workflow comes in really handy for developers in situations like the following:
Advanced debugging of a workflow
Copying a workflow between sites with some modification
Integrating a workflow into a SharePoint app where the workflow is not necessarily attached to any list within the app
However, while working with SharePoint Designer 2013 there is no direct method to export the workflow XAML, and the process is not intuitive. In this article we will explore one undocumented method to export a workflow in its XAML form.

The Process

Launch SharePoint Designer 2013 and connect to a SharePoint 2013 site. The site can belong to an on-premises SharePoint 2013 installation or to Office 365. Once connected, select an already existing workflow or create a new workflow (Figure 1: Select a Workflow). Next, save the workflow as a template by clicking on the "Save as Template" button in the ribbon. This saves the workflow as a WSP package in the Site Assets library under the current site; the file is saved with the same name as the workflow (Figure 2: Save as template). Open the site in the browser and select "Site contents" from under the Settings icon, then select Site Assets from the app list. You should be able to find the WSP file in there (Figure 3: Workflow WSP file in Site Assets). Download a copy of the file to your local desktop, then change the extension of the file from WSP to CAB. This will change the file icon to that of a zip file. Extract the file using WinZip or WinRAR and navigate within the extracted folder; you will find the workflow.xaml file in the folder structure shown in Figure 4 (Workflow.xaml location). You can open the XAML file in any regular XML editor for review and update (Figure 5: Workflow.XAML). Happy coding!
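If you prefer to script the rename-and-extract part, the following is a minimal PowerShell sketch using the built-in expand.exe; the file names and paths are illustrative and should be replaced with the actual location of your downloaded WSP.

# Rename the downloaded WSP package to a CAB file
Rename-Item -Path "C:\Temp\MyWorkflow.wsp" -NewName "MyWorkflow.cab"

# Extract the CAB contents to a folder
New-Item -ItemType Directory -Path "C:\Temp\MyWorkflow" -Force | Out-Null
expand.exe -F:* "C:\Temp\MyWorkflow.cab" "C:\Temp\MyWorkflow"

# Locate workflow.xaml inside the extracted folder structure
Get-ChildItem -Path "C:\Temp\MyWorkflow" -Recurse -Filter "workflow.xaml"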
Netwoven . Blog . Jul 27, 2015 12:44pm
Hi, I'm Janet Usinger, another of the co-leaders of the Qualitative Methods TIG, and a co-editor with Leslie Goodyear, Jennifer Jewiss, and Eric Barela of a new book about qualitative evaluation called Qualitative Inquiry in Evaluation: From Theory to Practice (2014, Jossey-Bass). The process of interviewing participants in an evaluation shares a few characteristics with counseling sessions. Establishing rapport between the interviewer and interviewee is essential to gathering meaningful data. Evaluators generally enter the interview session with confidence that a constructive conversation can be launched quickly. There are times, however, when the evaluator finds him or herself at odds with what the interviewee is saying. Sometimes the tension arises because there is a philosophical difference of opinion; other times, it is just that the two individuals do not particularly like each other. I have had several experiences interviewing adolescents (and adults) who simply pushed my buttons. Yet removing the individual from the study was inappropriate and counterproductive to the goals of the evaluation. Hot Tip: Put on your interviewer hat. Your responsibility is to understand the situation from the interviewee's perspective, not get caught up in your feelings about their statements. Hot Tip: Be intensely curious about why the person holds the particular view. This can shift the focus in a constructive direction and deepen your understanding of the interviewee's underlying experiences and perspectives on the issue at hand. Hot Tip: Leave your ego at the door. Remember, it is their story, not yours. Lesson Learned: Once I took my feelings out of the equation, interviews with people with whom I do not click have become some of the most meaningful interviews I've conducted. This is not necessarily easy, and I generally need to have a little private conversation with myself before the interview. However, once I do, I am able to dig deeper in trying to understand their perspectives, frustrations, and worldviews. Rad Resource: More stories about being in the trenches of qualitative inquiry in evaluation can be found in the final chapter of our new book, Qualitative Inquiry in Evaluation: From Theory to Practice (2014, Jossey-Bass). The American Evaluation Association is celebrating Qualitative Evaluation Week. The contributions all this week to aea365 come from evaluators who do qualitative evaluation. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. Related posts: Stefanie Leite on Interview Tips for Job Seekers in Evaluation Dreolin Fleischer on Organizing Quantitative and Qualitative Data Stefanie Leite on Building Rapport during Telephone Interviews
AEA365 . Blog . Jul 27, 2015 12:44pm
                    Q:  My wife interviewed for a job recently and I was amazed that one of the questions she was asked by the interviewer was "Do you have any children?"  We do, but what does that have to do with her qualifications for the job?  Plus, I thought it was illegal to ask those questions.  A:  Actually, it is not technically illegal to merely ask the question.  What’s illegal is to base a hiring decision on the answer to the question. So let’s say your wife...
SHRM . Blog . Jul 27, 2015 12:43pm
HR Managers can increase their strategic business relevance by providing analytics that are meaningful, comparable, and actionable in a timely manner. This requires evidence-based thinking, reliable data science, and meaningful analytics that directly address real business value. In this presentation delivered at Workforce & HR Analytics Summit West 2015, Edward M.L. Peters, CEO at OpenConnect, illustrates how HR Managers can provide strategic analytics to their business partners. HR Analytics Expert Panel Q&A : Getting company buy-in for HR analytics Recorded at The Workforce & HR Analytics Summit West 2015, this expert panel discusses how to get company buy-in for HR analytics. During this Q&A session, the audience has a chance to engage the speakers directly. The HR analytics panel includes: Annette Blount Senior Manager, Workforce Analytics and Global Reporting, Wyndham Worldwide Suzanne Bell Director, HR Strategy, Talent Planning & Change Management, Toyota Financial Services Edward M.L. Peters, Ph.D, Chief Executive Officer, OpenConnect Request WorkiQ Demo The post CEO Talk : Increasing Your Strategic Relevance Through HR Analytics appeared first on WorkiQ Blog.
WORKIQ . Blog . Jul 27, 2015 12:43pm
Overview

You are developing an SSIS package and want to work with a recordset in a Script Task. This blog discusses how you can get access to the records in the Script Task using an Object type variable.

Steps
1. Go to Start >> All Programs >> Microsoft SQL Server 2012 >> click on SQL Server Data Tools.
2. In Visual Studio, go to File >> New >> click on Project.
3. Select Business Intelligence under Installed Templates >> Integration Services Project and click the OK button.
4. In Visual Studio, in Solution Explorer, you will see Package.dtsx.
5. Double-click the Package.dtsx file to open the design editor for that package.
6. Now go to the SSIS toolbox and add (either drag and drop or double-click) a Data Flow Task control inside the Control Flow tab.
7. Add another control, a Script Task, from the SSIS toolbox to the Control Flow tab.
8. Create a connection between the Data Flow Task and the Script Task using the downward arrow of the Data Flow Task.
9. Now we need to add an Object type variable, so go to the right-most side of the package design editor and click on Variables.
10. This opens a window at the bottom of Visual Studio.
11. Click on Add Variable.
12. Give the variable a name and select Object as the data type.
13. Double-click the Data Flow Task; this brings you to the Data Flow tab.
14. We can use any kind of data source such as Flat File Source, Excel Source, OLE DB Source, etc. I have used OLE DB Source, as per the project requirement, so drag and drop the OLE DB Source control from the SSIS toolbox onto the Data Flow tab.
15. Double-click the OLE DB Source control to open the OLE DB Source Editor. Inside the connection manager section we have to set the OLE DB connection manager.
16. Click on the New button to open the Configure OLE DB Connection Manager window.
17. Again click the New button, enter the server name and select the database name in the Connection Manager window, and click OK.
18. Now select the table name.
19. We can choose selected columns from the selected table: go to the Columns tab, uncheck the checkbox beside any unneeded column, and click OK.
20. Add another control onto the Data Flow tab called Recordset Destination.
21. Connect the OLE DB Source with the Recordset Destination using the arrow at the bottom of the OLE DB Source control.
22. Double-click the Recordset Destination to open the Advanced Editor for Recordset Destination. Inside the Component Properties tab there is a custom property called VariableName; select the Object type variable that we previously created.
23. Go to the next tab, Input Columns, and choose the columns as per your requirement; you can also set the usage type, i.e. whether the columns are read only or read/write. After all required settings, click OK.
24. Go back to the Control Flow tab and double-click the Script Task control. Set the Object type variable that we created under ReadWriteVariables.
25. Click on the Edit Script button; this opens another project (the VSTA script project). Inside that project, ScriptMain.cs is the class file where we have to add some code to access the Object type variable that we passed from our SSIS package.
26. We add a namespace for the OLE DB data adapter, using System.Data.OleDb; and then add a few lines of code inside the Main method (or any other sub method):
OleDbDataAdapter da = new OleDbDataAdapter();
DataTable dt = new DataTable();
da.Fill(dt, Dts.Variables[0].Value);   // fills the DataTable from the Object variable holding the recordset
27. Finally, we can access all the records passed from our SSIS package inside the DataTable object.

Output
Netwoven . Blog . Jul 27, 2015 12:43pm
Hello from snowy Boston! I’m Leslie Goodyear, one of the co-leaders of the Qualitative Methods TIG, and a co-editor, with Jennifer Jewiss, Janet Usinger and Eric Barela, of a new book about qualitative evaluation called Qualitative Inquiry in Evaluation: From Theory to Practice (2014, Jossey-Bass). When I was a new evaluator, I had a major "a-ha experience" while interviewing a group of women who participated in an HIV/AIDS training for parents. They were bilingual Spanish-English speakers, and I was definitely the least fluent in Spanish in the room. As they discussed ways in which HIV could be transmitted, one woman referred to a specific sexual activity in Spanish, and all the others laughed and laughed. But I didn’t know for sure what they meant; I had an idea, but I wasn’t sure. Of course, I laughed along with them, but wondered what to do: Ask for them to define the term (and break the momentum)? Go with the flow and not be sure what they were talking about? Well, I decided I’d better ask. When I did, and the woman said what she meant, another woman said, "Oh, no! That’s not what it means!" She went on to explain, and the next woman said she thought it meant something else. And on and on with each woman! It turns out that none of them agreed on the term, but they all thought they knew what it was. Lesson Learned: Ask stupid questions! I was worried I would look stupid when I asked them to explain. But in fact, we all learned something important in discussing the term, but also in talking about how we can think we all agree on something, but if it’s not clarified, we can’t know for sure. Lesson Learned: Putting aside ego and fear are critical to getting good information in qualitative evaluation. Often, stupid questions open up dialogue and understanding. Sometimes they just clarify what’s being discussed. Other times, even though you might already know the answer, they give participants an important opportunity to share their perspectives in greater depth. Rad Resource: More stories about being in the trenches of qualitative inquiry in evaluation, and asking stupid questions, can be found in the final chapter of our new book, Qualitative Inquiry in Evaluation: From Theory to Practice (2014, Jossey-Bass). The American Evaluation Association is celebrating Qualitative Evaluation Week. The contributions all this week to aea365 come from evaluators who do qualitative evaluation. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. Related posts: QUAL Eval Week: Leslie Goodyear, Jennifer Jewiss, Janet Usinger, and Eric Barela on The Role of Context in Qualitative Evaluation QUAL Eval Week: Leslie Goodyear, Jennifer Jewiss, Janet Usinger and Eric Barela on Qualitative Inquiry in Evaluation QUAL Eval Week: Michael Quinn Patton on Purposeful Qualitative Sampling
AEA365 . Blog . Jul 27, 2015 12:42pm
Dark Events, or work steps that normally go uncaptured, can dramatically change critical business decisions or cause a security risk. The discovery, collection, and analysis of Dark Events is necessary for reliable process analytics and operational intelligence. In this presentation delivered at Analytics for Insurance, USA, Edward M.L. Peters, CEO at OpenConnect, explains how uncovering Dark Events has helped insurance companies recover millions in lost business value. Although examples are based on insurance claims, this presentation provides actionable insight to anyone with a focus on business process improvement, six sigma, and business intelligence. Request WorkiQ Demo The post CEO Talk : Are Dark Events Distorting Your Business Analytics? appeared first on WorkiQ Blog.
WORKIQ . Blog . Jul 27, 2015 12:42pm
It is starting to feel like we are truly laying to rest #HRYesterday. The paper-pushing, request-based, reactive, and anti-tech-savvy form of operating is our past. Rejuvenating! Last week, I had the pleasure of getting coffee with Chris Rutter, 33, CHRO of Matrix Systems Holding, LLC in Columbus, OH. Impressed yet?! Please let me continue! We discussed everything from being a young leader to workplace equality. Chris was dressed in denim shorts cuffed at the knee, thong sandals, a backpack, and a tattoo gracing his right forearm that read "familie" in a thick brush script. More casual than...
SHRM . Blog . Jul 27, 2015 12:41pm
Overview

In SharePoint, the chrome is defined by the master page, which defines the overall layout, core styling, page behavior, and the location and size of the content area, and includes any common controls shared across pages. The content area is defined by the content page, which inherits style and behavior from the master page and interacts with controls on the chrome. Minimal Download Strategy (MDS) is a new feature in SharePoint 2013 that improves client rendering performance and fluidity when navigating from page to page by downloading only the changes between two compatible pages. Fewer bytes are downloaded and the page appears more quickly. It reduces the amount of markup, CSS, scripts, etc. that the browser needs to parse and render, improving overall performance and providing smoother transitions. Despite MDS, a change as simple as a custom color applies differently to the three types of sites: Project, Publishing, and Team.

Issue

On one hand, applying custom colors to a site theme is not an available option out of the box; the other issue is modifying the theme color for all subsites of a site, or for all sites within a site collection.

Solution

The following processes sequentially walk through modifying the theme color to a custom color using the Microsoft SharePoint Color Palette Tool, and then modifying site or site collection settings to apply the custom color throughout.

Modifying the Theme Color
1. It is of foremost importance to identify the theme so that changes can be made only at the required place. The first step is to check access to Site Settings >> Composed Looks. Look for the Current item name in the Name column. The .spcolor file can be downloaded by clicking on its name (in the Theme URL column of the corresponding item). Note this file path, as it will be handy later. Note: To go to Composed Looks, if the site is at the root site collection, modify the URL as http://domain/_catalogs/design/AllItems.aspx; otherwise use http://domain/sites/<site-collection>/<site-name>/_catalogs/design/AllItems.aspx, where domain is your domain; replace <site-collection> and <site-name> accordingly.
2. Open the downloaded .spcolor file in the tool to display the color palettes specified in the existing file.
3. Click on the color picker and fill in the R, G, B and Opacity fields with the desired values. Once filled, the color picker changes to the desired color.
4. To apply the effect, click on Recolor beside the color picker, and the master page preview is modified with tones of the desired color.
5. Save the changes in the .spcolor file under another name. The new name should be chosen carefully so as not to overwrite any of the available files.
6. To put the new .spcolor file into effect, the file has to be uploaded to the Theme Gallery >> 15 folder. Note: While Composed Looks exists individually for each site, the Theme Gallery is unique to a site collection. If you can't find "Themes" in the site settings, you can use the URL http://domain/sites/<site-collection>/_catalogs/theme/15/ to get directly to the library. When modifying at the root site collection, the Theme Gallery is accessible directly at the root, e.g. http://domain/_catalogs/theme/15/. Click on "new document" and upload the file.
7. Once uploaded, copy the file URL from the menu options by clicking on '…'.
8. Now that the modified file exists in the library, note the Master Page URL of the Current item and add a new item to Composed Looks for the desired theme color. Fill in the required fields: enter the Master Page URL pointing to that of the Current item, and point the Theme URL to the newly added .spcolor file. Upon saving, the custom theme color becomes available as an item in Composed Looks, which in turn is an option in Change the Look.
9. The final step is to select and apply the modified theme from the Change the Look option. Click on "Try it out" and then opt for either "Yes, keep it" to apply or "No, not quite there" to exit.

Branding Sites with the Custom Color

Nature of sites: Inheritance behaves differently in the Publishing, Team, and Project Site type site collections. To apply branding, inheritance needs to be tweaked through workarounds.

Publishing Site: By virtue of its configuration, subsites of the Publishing Site type inherit the theme of a parent site of the same type. The theme applied to the site collection root site descends to all the existing subsites and also to new subsites created after the theme is applied. This behavior ceases beyond the node (site) where inheritance is broken, after which all subsites inherit from their immediate parent.

Team Site: Any subsite in a Team Site type site collection assumes the default color. Changing the theme color is local to a particular site and does not affect existing or new subsites within it. The same pattern is observed when the theme color is changed on the site collection root site. However, if the theme color behavior of a publishing site is required along with the characteristics of a team site, this can easily be achieved on the root site of the Team Site type site collection by activating the SharePoint Server Publishing Infrastructure feature from Site Settings >> Site Collection Administration >> Site collection features.

Project Site: A Project Site behaves like a Team Site, in that a theme color change is only local to a site. Similarly, SharePoint Server Publishing Infrastructure should be activated at the site collection level to achieve inheritance.

Workaround for Inheritance

This workaround is intended for Project and Team Site type site collections, where Master Page is still not a direct option under Site Settings >> Look and Feel as it is in a Publishing Site. Append _layouts/15/ChangeSiteMasterPage.aspx to the site collection URL in the address bar to reach the Site Master Page Settings page. At the root site of the site collection, the "Inherit the theme from the parent of this site" option remains disabled. However, once the feature is activated, the "Reset all subsites to inherit the theme of this site" option becomes available under Theme (compare the Before and After screenshots). To apply the theme color of the root to all subsites, select "Reset all subsites to inherit the theme of this site". This ensures that all existing subsites are overridden to the root theme color.

Limitation

This workaround does not apply to new sites created at any level underneath the site collection; any newly created site will not assume the custom color of the site collection.
This can be checked on the Site Master Page Settings page for subsites (by appending _layouts/15/ChangeSiteMasterPage.aspx to the site URL in the address bar); the Existing Subsites and New Subsites screenshots illustrate the difference.

Workaround

A manual selection of "Inherit the theme from the parent of this site" paints the newly created subsite with the site collection theme color. This manual process has to be performed meticulously on all new subsites, unless inheritance is broken, thereby creating a node below which the subsites inherit from their immediate parent site until that node is reached.
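For farms with many existing subsites, a similar result can be approximated from PowerShell by applying the custom color palette to every web in the site collection. This is a minimal sketch using the server object model's SPWeb.ApplyTheme method; the site collection URL and the .spcolor file name are illustrative, and you should test it on a non-production site first.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$site = Get-SPSite "http://domain/sites/site-collection"
# Server-relative URL of the custom palette uploaded to the Theme Gallery in step 6
$paletteUrl = ($site.ServerRelativeUrl.TrimEnd('/')) + "/_catalogs/theme/15/custom.spcolor"

foreach ($web in $site.AllWebs) {
    # Apply only the color palette; keep the existing font scheme and background image ($null)
    $web.ApplyTheme($paletteUrl, $null, $null, $true)
    $web.Dispose()
}
$site.Dispose()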
Netwoven . Blog . Jul 27, 2015 12:41pm
My name is Michael Quinn Patton. I train evaluators in qualitative evaluation methods and analysis. Qualitative interviews, open-ended survey questions, and social media entries can yield massive amounts of raw data. Course participants ask: "How can qualitative data be analyzed quickly, efficiently, and credibly to provide timely feedback to stakeholders? How do everyday program evaluators engaged in ongoing monitoring handle analyzing lots of qualitative responses?" Hot Tip: Focus on priority evaluation questions. Don't think of qualitative analysis as including every single response. Many responses aren't relevant to priority evaluation questions. Like email you delete immediately, skip irrelevant responses. Hot Tip: Group participants' responses together that answer the same evaluation question, even if the responses come from different items in the interview or survey. Evaluation isn't item-by-item analysis for the sake of analysis. It's analysis to provide answers to important evaluation questions. Analyze and report accordingly. Hot Tip: Judge substantive significance. Qualitative analysis has no statistical significance test equivalent. You, the evaluation analyst, must determine what is substantively significant. That's your job. Make judgments about merit, worth, and significance of qualitative responses. Own your judgments. Hot Tip: Keep qualitative analysis first and foremost qualitative. Ironically, the adjectives "most," "many," "some," or "a few" can be more accurate than a precise number. It's common to have responses that could be included or omitted, thus changing the number. Don't add a quote to a category just to increase the number. Add it because it fits. When I code 12 of 20 saying something, I'm confident reporting that "many" said that. Could have been 10, or could have been 14, depending on the coding. But it definitely was many. Cool Trick: Watch for interocular findings - the comments, feedback, and recommendations that hit us between the eyes. The "how many said that" question can distract from prioritizing substantive significance. One particularly insightful response may prove more valuable than lots of general comments. If 2 of 15 participants said they were dropping out because of sexual harassment, that's "only" 13%. But any sexual harassment is unacceptable. The program has a problem. Lesson Learned: Avoid laundry-list reporting. Substantive significance is not about how many bulleted items you report. It's about the quality, substantive significance, and utility of findings. Lesson Learned: Practice analysis with colleagues. Like anything, you can up your game with practice and feedback, increasing speed, quality, and confidence. Rad Resources: Goodyear, L., Jewiss, J., Usinger, J., & Barela, E. (Eds.) (2014). Qualitative Inquiry in Evaluation: From Theory to Practice. Jossey-Bass. Patton, M.Q. (2015). Qualitative Research and Evaluation Methods, 4th ed. Sage Publications. The American Evaluation Association is celebrating Qualitative Evaluation Week. The contributions all this week to aea365 come from evaluators who do qualitative evaluation. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Related posts: Cultural Competence Week: Osman Ozturgut, Tamara Bertrand Jones, and Cindy Crusto on Cultural Competence in Evaluation Dissemination Working Group: Teaching in the 21st Century Jacquelyn Christensen on Wordle and Survey Anchors Heather Bennett on Before the coding begins…
AEA365 . Blog . Jul 27, 2015 12:41pm
5 reasons HR is interested in desktop analytics

Accurate and real-time performance measurement: Identifying top performing teams and individuals is critical to building a culture of accountability and high employee engagement. WorkiQ provides the operational intelligence necessary to evaluate true staffing needs, reduce outsourcing, and lower the overall costs of operations. By providing real-time dashboards with insights into actual performance at any given moment, your managers will be empowered to provide guidance for optimal performance.

Increase employee engagement: Companies with "highly engaged" employees outperform other companies by 23%. Identifying where to reward top employees based on true performance measurement is difficult for most companies. Real-time workforce analytics makes it possible to see top performers as they emerge. With this information, companies can introduce gamification, leaderboards, and reward systems to encourage new levels of engagement.

Identifying hidden potential: Analytics can be used to identify high-performing teams and individual team members. Analyzing patterns of successful work enables companies to spot individuals who outperform their peers, utilize their time efficiently, and make the best use of business applications to complete the job at hand.

Identify coaching opportunities: On average, US employees waste 2 hours a day beyond breaks and lunch hour. However, most companies only have self-reporting methods to track the amount of work and time spent on various tasks. Real-time collection and reporting reveals instant performance measurement of both in-house and remote employees, allows for "in the moment" coaching opportunities, and significantly recaptures empty labor hours.

Sourcing big data for workforce analysis: Do you have the right number of people assigned to the appropriate inventory of work? How many people do you need to handle open enrollment this year? Take the guessing out of staffing; WorkiQ provides data on actual activity and work productivity, providing true FTE analysis and ensuring your company has the right-sized team for the workload.

The post Desktop Analytics for HR appeared first on WorkiQ Blog.
WORKIQ . Blog . Jul 27, 2015 12:41pm
Overview

You have a SharePoint 2010 or 2013 environment. You have also deployed My Sites, and your users have been using the My Site feature. You had an old Users OU in your AD and have now decided to create a new OU for whatever reason, and you have gradually been moving users from the old OU to the new OU. While this user move has been occurring over a period of time, a few end users are now receiving a warning email that the My Site of one of their reporting employees will be deleted. This issue is around the specific scenario described below, and the suggested resolution should be reviewed against your own scenario.

Issue Diagnosis

Let's say you have the following OUs in your AD: OldUsers is the old OU and New Users is the new OU. Over the last month your IT team has been gradually moving users from the OldUsers OU to the New Users OU. In SharePoint, the User Profile Synchronization connection points only to the OldUsers OU. As users were moved to the New Users OU, the following has been happening:

The moved user's profile was deleted from SharePoint.
If that user had a My Site and an assigned manager in AD: the My Site secondary administrator was changed to the manager, the My Site was queued for deletion in 14 days, an email was sent to the manager (this is the first warning), and after 12 days a second email was sent notifying that the site will be deleted after 3 more days.
If that user had a My Site and no manager in AD: no email was sent to anybody, there was no change to the site administrator, and the site was queued for deletion in 14 days.

The entire process is managed by a timer job called "MySite Cleanup Job". This timer job runs every day from 1 AM to 6 AM (default setting).

Resolution

First and foremost, disable the "MySite Cleanup Job" timer job to avoid any accidental My Site deletion. The goal is to recover from this issue gracefully and not directly edit database table content, which is not supported by Microsoft, so proceed with your own caution.

You will first need to determine the current state of user profiles in the SharePoint Profile database. Run the following simple SELECT query against your SharePoint Profile database:

SELECT [DisplayName], [Email], [NotificationStatus], [Created]
FROM [PROD_Profile].[dbo].[MySiteDeletionStatus]

(For privacy reasons, the names and emails have been removed from the screenshot in the original post.) For each user, calculate the date 14 days after the Created date. In the SharePoint farm, under the User Profile Service, revise the synchronization connection to ensure that both the New Users and OldUsers OUs are included. Run a full profile synchronization and ensure that you can now find all the users (or whatever the latest count from the above SQL query is) under User Profiles. Under Monitoring, ensure the "MySite Cleanup Job" is active again and scheduled to run daily (the default). Ensure that the daily database backups are done. The next step is to watch for each of the marked user site collections to be deleted by the "MySite Cleanup Job" on the dates you calculated by adding 14 days (the site collection does not get permanently deleted; with the SP2010 SP1 changes, the site collection stays in the site collection recycle bin at the farm level). After a site is deleted by the "MySite Cleanup Job", re-run the SQL query the next day to ensure that the user's name is no longer listed. Now run the following PowerShell commands to check for and restore the deleted site collection from the internal site collection recycle bin.
On any server in the farm, open PowerShell and run Get-SPDeletedSite; in the example from the original post, Jane Smith's site was deleted. Now run Restore-SPDeletedSite as shown in the sketch below. At this stage the user's site is back as it was, and there is no longer an entry for this site in the MySiteDeletionStatus database table. If a manager was assigned as the secondary site contact, then remove the manager as the secondary site contact from Central Administration. This should get the user's site collection reinstated as is.
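For reference, this is a minimal PowerShell sketch of the check-and-restore step; the personal site path is illustrative (matching the Jane Smith example above) and should be adjusted to the My Site you are recovering.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# List site collections currently sitting in the farm-level site collection recycle bin
Get-SPDeletedSite | Select-Object SiteId, Path, DeletionTime

# Pick the deleted My Site by its server-relative path and restore it
Get-SPDeletedSite | Where-Object { $_.Path -like "*/personal/jsmith*" } | Restore-SPDeletedSite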
Netwoven . Blog . Jul 27, 2015 12:40pm
5 reasons Back Office Operations are interested in workforce analytics & automation

Measure in real time: On average, US employees waste 2 hours a day beyond breaks and lunch hour. Real-time workforce analytics captures the activity of all associates, even those at home, to identify productive and unproductive practices. WorkiQ captures all counts, time, and outcomes of activity so work can be categorized and managed.

Manage in real time: Employees perform at varying levels of productivity and efficiency based on training, engagement, experience, and even acute situations in their personal lives. Effective managers need reliable operational intelligence to identify whether workers need training or are not optimizing their work hours. WorkiQ provides the operational intelligence needed to identify, improve, and reward employees through real-time management dashboards.

Improve in real time: Dramatic productivity improvements start with increased engagement. Through awareness, scorecards, and gamification, WorkiQ workforce analytics delivers a wide range of reports that empower people at every level of the company to compete and engage. Through real-time metrics, as opposed to infrequent performance reviews, associates know how they are performing in comparison to their peers, where they excel, and where they can improve. Managers can compare employees against accurate standards, reward superstar performers, and see where their team ranks against other groups or departments.

Optimizing labor costs: Companies using data-driven decision-making were, on average, 5% more productive and 6% more profitable than their competitors. A back office operation's largest cost is labor. By using WorkiQ, you are able to identify empty labor and recapture productive hours, identify the true need for overtime, and use real-time data to measure the ability to work the inventory.

Robotic process automation: A natural use of operational intelligence is identifying opportunities for robotic process automation (RPA). Identifying and replacing routine or repetitive back office work with software robots enables companies to save considerable expense. Insurance companies, for example, use robots for claims auto-adjudication improvement. With the ability to identify, configure, and execute automations, WorkiQ RPA provides a complete solution that delivers significant savings back to your company.

The post Analytics and Automation for the Back Office appeared first on WorkiQ Blog.
WORKIQ . Blog . Jul 27, 2015 12:39pm
Hello, I’m Eric Barela, another of the co-leaders of the Qualitative Methods TIG, and a co-editor with Leslie Goodyear, Jennifer Jewiss, and Janet Usinger of a new book about qualitative evaluation called Qualitative Inquiry in Evaluation: From Theory to Practice (2014, Jossey-Bass). In my time as an evaluator, I have noticed that discussions of methodology with clients can take on several forms. Most often, clients are genuinely interested in knowing how I collected and analyzed my data and why I made the methodological choices I did. However, clients have occasionally tried to use what I like to call "methodological red herrings" to dispute less-than-positive findings. I once worked with a client who disagreed with my findings because they were not uniformly positive. She accused me of analyzing only the data that would show the negative aspects of her program. I was able to show the codebook I had developed and how I went about developing the thematic content of the report based on my data analysis, which she was not prepared for me to do. I was able to defend my analytic process and get the bigwigs in the room to understand that, while there were some aspects of the program that could be improved, there were also many positive things happening. The happy ending is that the program continued to be funded, in part because of my client’s efforts to discredit my methodological choices! Lesson Learned: Include a detailed description of your qualitative inquiry process in evaluation reports. I include it as an appendix so it’s there for clients who really want to see it. It can take time to write a detailed account of your qualitative data collection and analysis processes, but it will be time well spent! Rad Resource: More stories about being in the trenches of qualitative inquiry in evaluation, and using detailed descriptions of qualitative inquiry choices and processes, can be found in the final chapter of our new book, Qualitative Inquiry in Evaluation: From Theory to Practice (2014, Jossey-Bass). The American Evaluation Association is celebrating Qualitative Evaluation Week. The contributions all this week to aea365 come from evaluators who do qualitative evaluation. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. Related posts: QUAL Eval Week: Michael Quinn Patton on Qualitative Inquiry in Utilization-Focused Evaluation Michelle Searle on the Role of Arts in Evaluation CAP Week: Sandra Eames on Utilization Focused Evaluation
AEA365 . Blog . Jul 27, 2015 12:39pm
On June 24, @SHRMNextchat chatted with Aliah Wright (@1SHRMScribe) about HR and Cybersecurity. In case you missed this important chat filled with tips and advice for protecting your organization's most sensitive information, you can read all the tweets below:   [View the story "#Nextchat RECAP: HR and Cybersecurity " on Storify]  ...
SHRM . Blog . Jul 27, 2015 12:39pm
Introduction

Variations is a SharePoint feature that lets you synchronize content between different multilingual sites in SharePoint by translating the content from the source site to the target. It is a mechanism that can be used to serve the same content to multiple audiences. This feature is available in SharePoint 2013 as well as SharePoint Online. In SharePoint 2010, variations were also used to provide content specifically targeted at audiences that used different devices (such as phones) or that required different branding; in SharePoint 2013, you can achieve those results using device channels.

Implementing Variations

Variations are enabled and set at the site collection level, and hence the settings for variations can be found in the Site Settings of the root site. If your site is a subsite, you will need to click the "Go to top level site settings" link under the Site Collection Administration group on the Site Settings page. There are three variation-related links available under the Site Collection Administration settings: Variations Settings, Variation Labels, and Variation Logs.

Variations Settings - The Variations Settings page lets you configure the variations setup in your site, such as setting variations selectively or across the site collection, recreating deleted target pages, updating target page web parts, and sending out notifications once the translation is complete and target pages are updated.
Variation Labels - The Variation Labels page lets you create labels for the different languages that you want to support in your site(s) and the hierarchies between them.
Variation Logs - As the name suggests, the Variation Logs page lets you view the logs of the content that was translated.

Broadly, the steps required to set up and configure variations are: plan and create a variations hierarchy to synchronize the content from the source site to the destination site(s), create a site that supports variations, set up the appropriate variation labels, and configure the variation hierarchy between the required variation labels. To be able to create variations in a SharePoint site, the site either needs to be created using one of the publishing site templates or should have the Publishing Infrastructure feature activated (a PowerShell sketch for activating these features appears at the end of this post).

The following section talks about how to create a label and walks you through the translation process using the Microsoft Translation Service that comes out of the box: create the target label, edit or add content in the source label, confirm the content updates appear in the target label, and translate the content in the target label using the Microsoft Translation Service.

Step-by-step walkthrough:

1. Create the target label (French in this case):
a) Go to Site Settings -> Variation Labels and click on the "New Label" link.
b) Select the site template language and locale.
c) Click Next to enter the label name and display name.
d) Select the translation options, in this case French.
e) I would like the target to be updated as soon as the source is updated.
f) Review the settings and click Finish.
g) You should see the new label, and the Hierarchy column should read "Yes". If it says "No", click on the "Create Hierarchies" link.

2. Add or edit content in the source label

For demo purposes, I added content to the default.aspx page of the site and then checked in, published, and approved the page (the typical content authoring workflow process). Go to the Pages library, select the default.aspx page, and click on the "Update all targets" link in the ribbon. Give it a few minutes for the system to sync the changes to all target labels.
There we have it! Once the content appears in the target label, you can either use the Microsoft Translation Service to translate it for you or translate it manually (the manual translation workflow is outside the scope of this article). Click the highlighted button to use the Microsoft Translation Service. Translation is an asynchronous operation, and you will receive an email notification as soon as it completes. Finally, we see the content translated to French; you can then publish the page to make it visible to your site users. Hope this helps, and thank you.
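The out-of-the-box machine translation step depends on the Machine Translation Service application being provisioned and started in the farm. A quick, hedged check from the SharePoint Management Shell (the name filter is an assumption; exact display names vary by farm):

```powershell
# Sketch: confirm a Machine Translation Service application exists and is online.
Get-SPServiceApplication |
    Where-Object { $_.TypeName -like "*Translation*" } |
    Select-Object DisplayName, Status

# The corresponding service instance should be Online on at least one server.
Get-SPServiceInstance |
    Where-Object { $_.TypeName -like "*Translation*" } |
    Select-Object TypeName, Server, Status
```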
Netwoven . Blog . Jul 27, 2015 12:39pm
Identifying top-performing teams and individuals is critical to building a culture of accountability and high employee engagement. Real-time desktop analytics provides the operational intelligence needed to evaluate true staffing needs, reduce outsourcing, and lower the overall cost of operations. Dashboards with insight into actual performance empower your managers to deliver "in the moment" coaching and guidance for optimal performance. Use your own data to learn how much your operations could save. Try our new online Savings Calculator.

Three areas where you can find savings:

Productive hours - On average, US employees waste two hours a day beyond breaks and the lunch hour. If your organization can recapture part or all of this "empty labor," productivity will increase and you can do more work with the same staff. WorkiQ provides real-time data showing how much time is spent on productive and non-productive activities and categorizing the type of work that consumes the most labor hours.

Reducing overtime - According to a recent survey, Americans work an average of one hour of overtime each week. Sometimes the business may need overtime to get through peak periods, but how do you truly know without accurate data? WorkiQ provides real-time data showing where you may need more, or less, of the work being done.

Eliminating self-reporting - Many companies rely only on self-reporting, or on disparate data pulled from multiple core systems, to track the amount of work done and the time spent on various tasks. These self-reporting methods rob your employees of time that could be spent doing real work.

Try our ROI tool and let us know what you think (a back-of-the-envelope version of the recaptured-hours math is sketched below): WorkiQ Savings Calculator. The post ROI from Real-Time Desktop Analytics appeared first on the WorkiQ Blog.
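As an illustration only (this is not the vendor's actual calculator), the productive-hours savings can be estimated as employees x recaptured hours per day x working days per year x loaded hourly cost. A minimal PowerShell sketch with assumed inputs:

```powershell
# Back-of-the-envelope savings estimate; every input value here is an assumption.
$employees          = 100     # staff covered by desktop analytics
$recapturedHoursDay = 0.5     # portion of the ~2 wasted hours/day you expect to recover
$workingDaysPerYear = 230
$loadedHourlyCost   = 35.00   # fully loaded cost per labor hour

$annualSavings = $employees * $recapturedHoursDay * $workingDaysPerYear * $loadedHourlyCost
"Estimated annual savings: {0:N0}" -f $annualSavings   # 402,500 with these inputs
```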
WORKIQ . Blog . Jul 27, 2015 12:39pm
As the Community Manager for AEA, I am the voice behind our Twitter page. Joining Twitter and interacting on the site can sometimes feel like a daunting task, so I am here to show you how easy it is to join Twitter and actively engage other users. So, why join Twitter? The site is a great resource for both your professional and personal life. Your colleagues and friends are on Twitter, and you don't want to miss out on the conversations.

Rad Resource: Here are the top five reasons to join Twitter.

You control the content - Unlike Facebook, where you can't always control the posts you see in your newsfeed, Twitter is more of a one-way street: if someone follows you, you're not automatically obligated to read about his or her life. You choose whom to follow and what your Twitter feed focuses on. For example, if you want your feed to focus on evaluation, follow other evaluation professionals who tweet about topics that resonate with your interests. Click here to see a past blog post with a list of evaluators you can follow. A common misconception: just because you are on Twitter does not mean you have to see what Kim Kardashian is eating for breakfast; you have to choose to follow her to get this exclusive scoop.

It's a news source - Twitter can help you stay up to date on evaluation trends and the latest evaluation news. Twitter members post articles, interesting facts, and tips and tricks focused on creating better evaluations. You can use a hashtag to follow certain trends. Popular hashtags that we follow are #eval, #YearofEval, and #dataviz.

Twitter is a great resource for networking - Twitter is a great place to find other professionals who share your interests. A big draw of the site is that it connects everyone from CEOs to comedians with everyday people. Follow people you find interesting and start conversations with them. Bounce evaluation techniques and ideas off each other. Before you know it, you will have created a strong network of evaluation supporters, professionals, and leaders.

Stay connected at conferences - Twitter is a great resource when you are attending a conference. Most conferences have a hashtag for their event (e.g., #Eval15 for Evaluation 2015) that you can follow to stay up to date on conference news and announcements. Tweeting can be a great way to reflect on your learning while attending a conference and can provide a useful record of key points. Tweet good sound bites, bits of new knowledge, quotes from presenters, your own opinions or connections you are making, or interesting facts and statistics. This provides a great summary of the event and helps others get more out of the conference, especially if they were not able to attend or missed a session. Click here to see a past AEA365 post about the success of last year's event hashtag (#Eval14).

It's only 140 characters! - If you're anything like me, you can read novels, in-depth features, and articles several thousand words long, but there are times when you'd rather not have to. Twitter is short, sweet, and straight to the point! It also presents a fun challenge: express yourself in 140 characters or fewer.

Be sure to follow AEA on Twitter (@aeaweb). If you're already connected, please feel free to print this out and give it to a colleague; they just might be interested in joining the conversation. Do you have questions, concerns, kudos, or content to extend this aea365 contribution?
Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:
Dan McDonnell on Making New Friends and Mastering Lesser-Known Twitter Features Without Third Party Apps
Dan McDonnell on Using Twitter to Enhance Your Conference Experience
Dan McDonnell on Getting More Out of Twitter Hashtags
AEA365 . Blog . Jul 27, 2015 12:39pm